[Binary artifact: POSIX (ustar) tar archive of a Zuul CI job's output directory, owned by user/group `core`. Members:

  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed; embedded filename `kubelet.log`)

The remainder of the file is the gzip-compressed payload of `kubelet.log.gz`; its contents are not recoverable as text.]
ŵ8ˤJN%ˤ u1y\^)Q-g.zoMPx}Sx ù"{JC!v5g7˒=ϡ'hmCo٧RY͗d{v,nfyg|}Yng }\>}Ƴ^Hr-'=+K8Qiq\.Ĵ|ݥH'DBKT_ @HEj'q&1ɜh̛G~b<5jG:^'JhiM\R+e <Fl!N!\4ZJrk)TV2ϸ7@BgOz.']weXA|Y?"~VC=<|LjKD0wal8K!& Q5"׸S$x@N@ (I 4pڳCRO%9Px0)JAYDr7`Ϋur(ky2{KerƁUR4%(D.D+=9vxfR5c+T(٭O7dzt6ĻNVj%Slj-|z3‹f1%z"k'Z@F_n.I-n*MekGDk|bu7 0$`~~tȓm%1ȯ'ު9kHNyℭmǓtT'9u P'ډc{fW*(g[rwvmi_ޑ##+AP| UJT *AP%A FPfleܨNeܨq2nTƍʸQ7$ԳFʸQCaeܨq2nTƍ 1ALE=%,f'C75TQZ¿2P,%Un'#;5Zr+Nܩ;rRTʝJS)w*N]{vdKJSJS)w*Nܩ;rB@*0rJS)w*Nܩ;rRГ{=Pz8Ћ Uh9U 4|#q X pLf8wlʇ@ȀHl>oNmJsID(XHJEFEo=l 6 A@| h->()9E,!RN@NKogjMY+I_F)nҦMK" [n79KM0ĬDXTEH48v?P1F`M BbP/CIoOpMF=~b8qXŭC*Ft`e)! 6!E p1󸸹~ظ~̉MMԖ6kdwtWmmF{6hCU& N\UJ9m0N*!h5d 륙C1ǧ6c[ϟ[7]pN;:o{|hb[6lݿm]wwJ!tloyw{$$=uqO2j%<~j.tvCmoۼɈۯۤJgZNȭGͅ\86Ѕ0nDG_:oD( Qc1FZ4Z gY%5du.w4~\̳֬7D| 2|}3&=KN4Dgqcrs[yzĆФah4<:?@◤eYIik!8II(CH` hu,~JQm"Ye2-iWȁR) *m2nG)Gfsma9Ϋ-<-\ۤZ )=ws3?=]_|-6IJ8Mrb\\.0`ʹ8E&Rix8 9wq&WڣJl =blbFaU1MzmtVXRV[T]li{XT%4xg$"h\ϣF3"x^̔7D Pb^:Vul]Ǭ״@(JMJS`)+zoDTQQͣ&>49/J؆훪iUӪRNZs5ObIJd#%+IDZ­Ce4G?zѾ͟TĖSDF+u&Xa#CMO:Pzxz܉"1Eb'&&M$c2QI A0(IzSej@hU{-uAĠx*ZLxC:M䋫%*[yG5+gև|e֥oBr!/9%+ߎ^LWfkTz$f?` Z5Z~: ,l^ھBn^I8#x5ݽL`).)ju3/WWϿej{Xi/+.P/Z(Y3 u)YĹ"><qxMS {Oiht8nnۭv|?,,ӶC&{ms?]j?]::. ,`wϺuf6w=~Ou'nfޓqdWd'-}0ٓ b{صרS">H%)X2U^3iFYA\0[Pcv׻Az7 Ye/D.H6)?6T(:I98Y[(=X^;bo(u=em'( sPxs0S)CdI"GraP$Xp* *Poj8rE؄y$ hc()A@N GgMgk 5|RB[xwZ,z4נ|nQ; ILH:%c3V 6RIO:<]zԺ$FwD@4:&́lPf R!  
0B'L= J NNgDU)S&$ش`'G K%*004JB(gHy$g!]T:$":'LsScč〪ErF>`BҢ##5Hh'y:.x )4Jy"*#k$HLaBrvgڗgeL%5ƅβ(XE ޴{q&-X nU3s"+g)^JÜey]*ZuAjVY3{2D͸)X?La,v)F"rGRP;5aDi ]E-,JC̥CRbE\xkUs%1Ƹg|  /.B-e{U+d+zs3f\֨3<1{3Zge#Po7?_]VΧ 9h*&DqeӾ,1KSAz=**Q W E##1>EÐ0xiRy,CW'`bnFp2f=Z29FQ Z>j=E.L\:L40KoRÐbF WTOXm7CE~~ߴ.pwip"Iʍ̐@Ne~%We=l=TxtTPrnuK`@D~~{ן?~gL?~kЃUu=mCt#M CS&g]!|q9jr ̸9]B()m rduPwE0}۴wy}[qGn$I-#Te<܂ Pd=ǘ zl~ (цZayZ"yB(g)x`颸S㤒|zʠY?LFL҅Δ dH OyvO m&JJ-Q9 @wpZRYVX]]}r8a"ѓ=ɾm~]Nl,FO~(xd1'_g[ad`pV1Zu y!NL+X%`4N@}iz?_3qZKcM"¨zu L?* JK^# $g\Kr1d ƘJhcAl17\ocaЋ00iaNLO627ka.غ[f;PI2ǺTbZwctuHr׆:}^]ִsYV;lk`ME7}uw؛*v`*_u]!$%qɤ-d Y4ef#hԭf>?NۀB »,-}6_$ ?@'ǝͥOm2c:$Sۊ})E{o+g^㸇'\0'խF-M7:H  -֨U;MWȕt%ol66LHN޸MZͦU[{E2O!xx $>0`jw'p ? ZvwcX+Su0*+𡈫{q(^TH<zn֩ݫUy7.\"ͼ8syJkCXnAA!+Z znڃd/FE*w`uN*PքL %5(oV1z4T^@u< ~fQ1[5I$RN\^7:/;n!RfԫqzX y)ˋ*+Yi3tny$XLjAFI7▱".D:,ͲW ZmMIdž`ZlTF 25ooxh2Fش ,7&.ޢ \U ~m,O L[Mҍ)X{W3ڛ)s1JUj"%Jvr Nv`oX9Vͳ%R)Or)]7'JBh2,g'%+š&e}Mch&doXW \)E\%hwq$GS{GqrO';)SkBR.>[+RΫJvgԹLN92INXgN AoT6,N1ɡ)v0҈ 2UhyK$kpܳz:>w`/' ,m놽6SO4',V%&SiJNa d'e9]%1 m~̆_>|rxst˵vM1"{\j\ѦɫǷ$/Q\-9尣cJZ B۴+VzgB\w|w*AH/'o>Dc "g>Gs`1юc}гal{;bn5gd>,o=Rf]PhI!Ȱ<شh0%Tx8ڀ(Ӟhtr 8M(rneZGEdQ0A[R.k%#RHDcӻz$t~OB {`?LoK*o ]y=κpn^G3^.-χ|&<W]'Dg{zݜȨ߀Ib1 εޕq$v>/C]oqB2%)bW=Iso..fvV15jN"߂FOgH1F Tdr@‹5!ZVT4:NyQTֻbl=7FT92_Mg;ߋ>FӼƂl7Xkl֢`WtY4ũ< .x:SOjTXsI!佴 $C=2I\ɩMB u!){ܡ уqpt!+jTO#Pu0H=þj@:zf AܕBRs_]ƅ7-:;h}dE3~wikr)ݯ;DЇ~9,6ϧ,M;gc?Y&ɺ:!M%-'5fF{c5b܆;zDz'CqyD\]8;p"V)35lKs\h1 Y`tQ*\N|Z'F81L1i_|9hR hc a*[&(K4v|Dl- ="{뇇57/kCD!Ҥx y+_-q+. c~{!Irk{C*K*Jq.MDԺPR*] 8K+魎;˽T|-?;'[Bã&+d)B7kH'9"zA0,,Dr>vPԊ1줧oP=?0y$ @NgwZ0D |3l#22cӫ=b.&nv:Y+G+? 
Ulc7ׄZno/t ]Xf e7Fc Rm9xK^o|="416%7=ɱ0_5y('̷ {f>bFsśAf>'s@d(㬤 أ{,XGG(E'դ(a%歎I6iMnB<&r%Rn=MlcF'-o }rb4R2helB*,ɇ,t9z=Gӑ ФʁSZ] "30\#s9c k G* l!\} /3˜Pw*֠26sdWQn:nAb~N~|Bo4Y<5nߡedmڭ-]ϵ].mk1)\]'qjQ|-\n.}B.ڵ4 ?_ n_Z7֛NtlIq7eC-9;GJPoqw%v]h~4̻x7ݳnx%NɺUXYsI/jܮ)76y;ǰ˒YfҡMbuBh'l43U2f0Mn@ؚ>2)p"(ޙɾ-5Q^Dcb}\-[ې~t}R߽[;ٌZ'ӊP.'O`^#eb2B8\rz>x,BqΣ YF@Ad>ȹ@x" ԍgF` Jnqb"l%x+G@|cTZdP)l26ބۤEӎ}x5:ypY9Tx_W}VgJ1BQ{+RY䃊NnR*^[dp<`[S!RY[DF2) dsoks>-Mgd܌RDVʑsRvKUX ѯ>yI7֠?~e !DY҇R%I&/t%ZQHΥhaJMheچb$ AHQĦ&<5$#\2=f]3NXk:%v ӊqy4Ru*g2,<8g\9 TVZ5"`NAyp$Ut[t:{-yxPEXkäeR6xr.4ϓO/.le=wRԬY,d\T_yL "~p]L} *7fV'otx*X+/0%1폕*/h$WsdƱ2'REɺ wPHZ^Ǭu~|(,Ete)ǂ(Q::RfQ(` dcyɳb eVNC e $Y`h㠘{T J99vZZ1^~IǔcU~Gk˭)/ʔhP;]dI^b^3c6QC$k(Z,lVm93+N88;uQ:Ea4 c"ɨ G'F@-Uz{}ѣܗcKPg5Jt< Y9_PGǩiY܁/5<p.59]T!ΘYY T<-üw~YܣUZ WANaUf~ Sx&X(pK@qM1L\s2/d2%xq-]J.W,XJXN7]ӾY Lsr6 S}ltw[1Ҷh۫ŧyz7>(M+et0940Oι+JTF 0nh޺7k^\u:U`vd7Dq?y?7YnvvI?]!><Mķ_cFfHb0b0J*oeG|,Xh8^lfEaq `muu[|9ꤼ_FX`o{Eh^w۩R J{匿O I,!׋^7dfT/-ң=/f5"E;|6+冥jԿϖnb1ɟHD7?o;Ǐ9g\سH#uSO׽& gCO_054ZZaYiZo*Vsk˝q-s3nH"f#A*Om|?7q_>%,9ƍ$R!-ZS%f$,$9Ȣ1Q&0pcg$=a!]B"y$Y !\ACĔ@:c<#$K~rN:GUM|%j鸳t◞&~I tCY(ӻeC]l:Q+ȍ6L5XCF%)$ zH&^=z͋WV?<3NhqYGU4!R^DC\D6%Nv.ۘP9϶n&:rdsX^fЀb M2ED 7R|v&Z!@*X쐃MHGqWkt܏f1CEʾ }Fvm$6ء'BGFis8t|H[L$H3 s6U'"m)eh;M)F0|NQtBFj7K_|/!-z(kzyо4Ksf~{˝{%~*)+#i{׻mV wA=ٳpqS 8~.^Ƌ^Ӎ Ƈ*WmA[+\vH뉐n>K<[bSyLv.mZ (_x~bvL4ygO^eڊ>] /F&Vj?'R_qx;.^l%W N/ڙeYT$yY#nMg;_/^Ջ? ]ha ](z $ ΢{qmLLmr[tFhe\ݬe٥ۯZ֛ur~6bŃZwWd&|_?؛UȻe{=j{uMQczt5^Nr`YB[pf`uƠؚ;DMwS I+J,JwMA"P^n/-"b$XM]=࡬d"W6dL@3Q(BLR.i ;&Ud J$ՊDv;i:gm&p&Ha;\\ vr|-rWYItJ *4BPّgrGTÎȩryK6=*ơƕਯ%HxM2"W<Ab((֪mR$v]9C!\1T5d>5(Ξ5T&7uMMlg'7K~{Wyڼȓ;pG<&۩5}f腘B rV٪͝1k:y}]ui";泞4OYDVռuB=[W^Dε]vq畖a2o鯹;wA{x3n3?`x6ybt.Iym{tƾ3ٿF/'숭]a! 
U˭͟69mndZ@kRLt6RLhڠ.T'L3#ޙ컲F~|yDiL)+%_#?ޒ+^ ֿ Ͻ&Wp`{}hYItLcL2h&$<,jGgU{9Q#d|Is][y#8čb\pqNѾtRlRG'gp]N^zV{*]ʻ=Q&oM: +TK9EVj$048-eSs!$S'zP!ǂAV-lB&U3nθ]Ky^X2 qƒ'F]n*Ϯd枻ӛ ?oPpe8^;&Is^o('{%c[Kv9 `t.&2N)gѫE96<.Cl`;8lKqb&&10L>&.̱Mg7vQX1.krmv~K~h)X() #< uKRThDM!=(?LB*BF͐ZўG&DsN%*ȣu!EGlibk QqĎ#`VFBhF$AK5D%U{v҄g~ RNK-&B2JAG"+UJ"ESU#g!wD>@**gh aep% B0B# 1ǁ  )R4DSb(:H '\%5DrL(A&8 lI52N/8kTo;o% _S~ "cagmqü񷏽s; .?bG+n181{/sWaqzkd` h )(Krr$; 79ΐ^ġZh*y,fvmvq_T O Bl8=4='DHc&W('VO@}yE޼yb?뭺w>7~/._U6LpyK`rs+DxUf⣳H|˥; m$ZGyaX0@ڇY%G? UR|<4 =sp1my8`:*#GmmԶgU.G $,|24ƩFx~Ss?K6Tw/W_SJn_z?5O.j!7qoV-*R>h~?!<~?ppOO)ӧ?Y(Nڦ>ߟ4 yЦ8^34װ J>bW2_",DO f#A}P um{?哸j%vMn~Fr:jeLdA)HORXGa(I3nC]R<;J[G"dL`N!7{Gi@"h$ʣfeq$t:EuC1x%EsR=tvhڠWuK㿤!RR_P jn|%,phʊ*cj`Ke1Y{;?'jND`2IzW(!(yp,PIڇZNRt<,Y˵pAbb> ME E$:'-3Tbv󰔚n=i!l{Ә6IGqhv~Kyj-'E*^ު #vLڑj-1<˥")"xyc͚ ߽;o&W>ǫK1KċiVuj6]COm\6K⢞$gG8ݝ,wW+Cou7^(cxva Y~(\6!t>vn/rYܸ ཥCQP}_O76z.V44/ۛGgfWt|ܛ(]d~Cj݌&˩^N׃|̼P Ϳ굖`\سJK{n~ S?k0Dd G cRq]H4B1Ʀ@WI\{؄}V i[3 DR^B܃ٓuaLڴeYޭpA#bjb/@97vYL=z7׻i\ gwm^}y42jٞsv3=쐘 x^>{1;PAKZyአcaP1@CY @(q&eGb% zV7YwY~^)4)a|\jhazg.P]WMiew܀0ku\X(1(kkn+%>xIU?uoMmxKqd'?$ےc]bSd%:4C?ƲH&f1Q]wNJ S[jo| (vlz1)0HrDŽrB#=d G' i^-=/~n/ghMaa,bX~ĽAv2i< L̹:fwzWg_&g7nNYŨ꽥Wd2&f[Vt&~mmlNj7MӺӗ*-mMdd_e_CٺʋbQSClqWANV)W}إ#{_˦^}{7lLjZFY& \ec3ؾ#K2=)_SĪ)gN|DU1hΔլXWJXj}Il !k'!YXڗ``D32RoU53&.H3\><r[>Wqf=Gt/bM?}N_鸹G\Vјͤuω."u{+±{HlF} !j[ 4ر *"2k K* kń6[kZD1^Z>Fī#\'7[p{M;}~xsR>WH1Q]lT5s`x>nfGDؗeԸPap}M{3hv"4Fv}X~, ˶vGw 甇Ud:Ϸhm[Ax\ͥ2308j*(D9)h,6joz{}`q~ [kη٪Z)㢎.%8;"GWY%QQ0P f/{CLWc6S=t4#[( d_%Z-V9JְUM;[p7q[<u]v56b &&@d"bv/ř$b*Ш<G8$@ѿ|Yus]*|yq&uQA5y8zkړҪDZrxiU"cZKBe+غӁ+{}*pJ=z)#\B" LD'WMܥGWMZOW"UjW`ݙӒ 7df5Pa|x%wP⾾Q{e?~|ho,Mo~ [G߷U t铁&?iMJ0#LBv=!>N8W WDvE,͘iJ oKV7]̊׷u>Ta'2!XLH훥gп)1vcM@Qΐ_,/?yIKVN[{MlvoIJwL-aMcu R.YA:r{. Wxz.A &K_-!DS}닦g %)Mb?mZbOyYoO=]~g[嚖mShNnj)mMQBy5QlM {ِV[ţvAXUKq<ٻѓbE8L6J5lEQJżȁS"/ ̬*"LPZh.u *'!o邒=:88X`:Y7qv]}f6SԋU.?-b֣8;+Ј׫M!bNƵBo9"H˚ɜ;az&woU+);ms Y.0 iZݨV>?&O:,o-Kf(5V!,(! 
̈JYqۄ`Z+^/OG*[,:RA=TaZNiϗklIx;ީ\Z yUyQy0Ks'S#Zc"xevl_к`=="5>v1l92574 *XjeP&q:L$Cِa1 4[JΈD;l~0/?rVr_p!ăq(;Xr~@0xu٫9Vn3iL'!! C3. 2*R|ˡw.*ZaWZJR,UGC)961'1hY`UmjB9m{jЙvg2sl~ %'Wf0YN|o&4#|ӛOYW0m:u<gsCmvԋ]x/.gsr*FѦSJ绹]tbRwsN[tċBw-/|78ys7CwunV c׏}26xFstXF07r6..F}tٷU?ETQ`Fv+2 ZzRػ̟dCN^dt)WϬ!NsFՈ1zimk&s5zJɓ5D\ٕvZѰք\z#c7q#fr,pƌXW,\yt!$/-~E>`?Ϧ/Uc9m7 ֦<߅dcA\voo*243(#hiK" dWS>TP0ؚJU]sz$/9Gڽ㱨#j 90h09ks1b`)$y-DMal. *䖙ʼ^7l?UZȁjH*)[:ZCLQ kVd@( C:2m)U } dUcsg 8G&t&a֌hXDa1b 5 phap?k3ŝ5ݖt}uq~~\zU;hx լ k:}cjMI)1f,ܨQIV4c$93iEls$1j8BHrJR.ѐ:c?{WFJ/uIyEi7wc^8H5Iɖ7xI$uxI)Y̊"220iT{1h0Ҙ*[`1 ({类j5"5 =)~ ꟉJ6\Nc^ >|kxԩ1\YwpEǩ?]^qVYwaGZ"+*^w)?ؗ)bGX)<`)"GW,Ս0\p*(tuUÀ^J/C:/ťCRbEل\ qnR4PUu*1_&ss 3V8ŕHբzmƟUzxtyu<{F̸ǣ=t8~Q|ZO6^:qSUW_5yOՍGͅl1)S\o^v뙥z:3Nrx=0dj!bž_nnZ,|ׯ`Ŵ{D}v.;c-할սec1}L?pϟ~tϧ` QpT-#O?"C|mu =ku3藣6߿W1"K_ED,G~W/ L{WO>O Y_$PHs et6"9L8tk/%u-4貑moXaFR(yX ==Z8g1LiP~t0PySMOJA9w/npKm?~iR %WZ׵+9?4<0Km*CJ7)BUqAsR'>+jEY#!tB(uIXX$H%Ql%Q먜XqT&1F: Hm%[ wE%<3XJ1Vٶ|E8Bz )ʅ=\X%ihqòI02vf)v =8ft#t$dxcwYX~j(0*yM`A%Ԗhǒ M0KDGծ},=9zy!#MP:QRo =5d\䃙}bztV宯t E߀W]Z1ŠC9*o$WZ{ Eߓh ힶB;<{1ڟlnwCqe5mFZ;BZH,ZGZLcX1lz.D%sĔ\*TeՆyIAw'`IsEh\ΒBdk`+oݻ}sb~t&yw虀^FcD2YH&R/5eDDL Q4`p%[+q}t1t;vI :Wq19W{<.&֮M%zI/> ̌.rf/Kp3"'*SD*p%J7bi1g;Ja8҆ܤ> ]/ߚp;lxU-.T`Šz7Kӻ)ǙY.G.+%sdf̂;cYg ]"W;,[yF^LT<0-aǮ^m,;i)}'-m}@h DڶL7?nþIqgMHn+bI،#):[oʨk !pIh-5I9#Ь7|m671\ :7-^3&1>dV&6q<1E k6QuxfJIhfs'VO[t.,|ۆFy(] ZL}NtdSl, K9^x|ӶbC䌂r/dRbJ e;P`i#6ei"&b K#4έI@Qv7n8Wag;ZA;&y|frnH.vW O)usJT`G V[k)/9֔(aBc|jvtyLLryå O=cF@+j$>gEm'+}{ QY&m!NZ{׷QKL˹YS+|LwdnްKeY-C4ư J-m4; @2P 1wAbd^Qf]PhI!Ȱ<شh0%T8ڀ(Ӟhur 8M(rnSQEw9lVJHH!m^Fû9 B#el{O#p,z5>2dH%1/;%sݛp|DFW-Ob#SJXp%5<dg{ ?aJ˱,X G4@[I#J+,Fj XVrq'g!3^Nw`ߢ|iGMǤhF߬N9AX! 
V*pbAFКFEZ`1VYET6ʌ ;NR*:dZyX ǃ~X{+_z\a=.)*r]-Y T{v=p\0s Ie$"_jY"&UggeԼz߁epSZH.ԛCR Ig2」k*.YJ3ͅNQbai0/aPn?(Rd&ŇKJY3Z@m>ͷv(僌tql ;CRl&hd>(1"XsD+Aw>WKrM۽φ>fnKi"OG>` Zj4FF9$*%"uHS>ᐦ̇4:+ 3r( %DJ^"H] QW@.GPUV}WW@IےQW!I #F]X j){c0Q)iVWP]1JQW\E]%jw`iTꌮ^DW??0jK1ӰrYN3[ նB=ΰ: %a'u~?~GuSم3.)'}a~g|v@0`t"ptvܰ*YMF5-AMzVzUonĒ=Uo WtZ{hǏq4uS?V;۞KRmË|jSa7 BC?~Qg[`ʶļHiRD69h/R\o󚸵Xp7pՄMLJJ-Ѣ20;˜|>³>$&X4\q0zZݢ*%ʎhHdJx8*塨D-LeuՕP\LMUvMLc}~_g4p [t,1d4>"Js3gx9AH4#SL;IwXnp#f}*CL@~j2 |.Mx?ׂa%'b7cN7o^؇o>kO0h;Z̩sŜ;gΙs՝;A+5S3'r ?33gΙs;gΙs;gΖ-Kjci)^«2DMR!J --% ]NJd *T\  q|AE}C!X{uzF:P yOwb1J=NScp@ :g(@2B0*QYíN*䃏7줜̒^_x>)anO!.2F'))eh R ,`MIxJYԎ:C#l{ #&W@HG4Ϳh>CQ 5DpJ DQǕ8d_>60;m|>E|⤚W|Űd_( EŎ/nǶX!b[a1x,lDv\4^2r7Ym"1+g=*>R6LO\Cj $C5 }tzArQ]YC|ddqO9񀔷;5 !}gt?j/IxP'"icS$R]yAb.$%-뭔Y+J胴999#Ѹ!0XCrg%7#rTZ\ C^jpRCEwZ["QjE"ǡ,`"إZ D38'B 34U4huzEhjXjXEdcڣojh\yaZw$+ I)&j*W.zHxxFBsP蹘2\ >u[ 76ϱVKSY? }:^s5o]j6zɖIc;7f=!p"]li7m73>P3܌h|-gw3r\?•`t.Qs ;)4??ϧXaԖ[g6g#4DL:Gq5V"S; $Rq]ؐ*F)xЙWﻈԳo]>]]7]6rMU Ϭm䪰4#iZ15pn:?k J(WGl&PX`St&DkoLNyx4 Z?k;MMf\g!O?*M.:|w|[+[}{%^MG7 TL$А=&(ř@RL ːJU s,PY #uϜ6|=A=Hvtpf| 5nxu==tYeCavy<\; m#VGOL&廂w^%NۀzH-6vYHږq˞Ev\M\nq"G.w?7 wt-7lӁA#-awG?.rܿ2s>  S6)pL=7a2<#ǁ1ca1Ńq"3%NWB~Sٻ:wEMo!I ˖Ƴ^"uVȬY$Бy"hU'J45x n >5ߪypcY 흃^Vmj=ju㚥C Bj/mR܁s7lԙWG?uvZW7f 7~+U­_h|(5ߐybא~qJ~XhÏrHV"]ԐIJ32j|e›_</1x!#rAm }:z7*yL<6y-YD7A5oszuvZ&=[FpE-f]3ӫp2cq޻_LOKz8vmbcft5^z9]#_?_#^}P<3>Qy- LHw,PyQqu4UWpS@qk(] t1)UɈP<˔g/F},CYTRı\cWz#\H *,XtJ>tr]gt ̈́Rƈܝ 7Yj/"0  PF)pvL]yZ۠wt~ZhP>T>uֈduoqT 8OMN*H{<+^Ӽ?~d؛{;Q_0q{kd~M+P@vMF 儝\s%9ܟ$^d,$L5%7_-"1 ~p7Y eiRq?lgq~wik6R_ۛ_j>o|~2͉e.if&_#nըd ޝ'͆g#Y~y_x{54nU1-R\دa?v۝]q/WӺ;D ҴV+eX2@YEH!G5̻h8YlvEA㪌l^uM6=t$f_Gc(28'7w;Z%coEen[N2;A|SPU"_~B"ȳ۳gopF>{_>rO^RM ǃ,mJ--vXDɧo Rں]_h1L|Y;*ᡞ'c>ۿ-F.zCs$u6#J˘I**qAP \; %wFS{ܥ6zZGRD# 2&0'齣4 G4IQ32D< ݡNCV+É}NW{+=h_{t^/'~I f`=tB%T" _YReC B~#@#uQ{NzVޏ~扈h^*^xBaiWH%dL>DPpҤIk8vdpv5_cHCcZ~e;X= ٧e/X(eK[mP(;vwԎ z >j1yAJB<0h=Nà]ӺKSITψ]eW϶+W\UHUR]@vF0 x>*+saWZeή2t%+Z `? 
ϣe?.sޟ_of>gzJ_!HTo3\ An j1E*$eYU2IQRS#ZfoZ.4f0m]'6s;@ǯJ01)(c2YhN/'@ivޓ3 I}6M8Y^܇n3Jot|9qQ;BGsh4b|M&Sf A%i6miO?q~RJ#J)*~syBr.@,0(/0։EY7D5cL9mQJtmp9!@p ׆;TRV֊ =C]%ZΗ~Uؔzi WǯڣrKr(>,}POǺ B@:@FD/OKœsqn3wR{)&"9$0k"Lʬg:e( Sբ9ó482r &m !ctFGFiZ;YϪMgO>;H|JoK (d&p( :Q;)>%A+PZ&X& |y+?{ ‹b pd9:})8{^ < ԪtpL:=eۑN  F`NK3:wZ0D |3Im#22¾_q\kI#}.{o,KN3kDXAPZh)e6<$*5+ RS9YF/E >Jڠ=5qtys4QBP;{4^MƳ?5i&ͼ[H*zpHM^D1tug}:qVp9B)n2!\c OCt>.N;?:qDN5(e:w4Q9pJ^ːAuULđ1s T#je6XS.C>RneNw;t5 kMg3 6k  t%[d6z=:\o1Lxєދ>)ףiVLJo.输9(~.7]^? M\0vpjPvu3k{Nm7ͶCjPfNwk{y{r7=G-ݭּ͏y?ob{:N/f%=swVC\b*]$]yȈ/ahCAV.ҡ[n|UiIU+kIUf+ձ'*TK&ؒq'lJiF~Ld:4,y+ wB!' !❅;00>%YҚh W6ҡ,yO^.3KneZb2-YrYF)ޙl3 GA΁ RK]̓wg9h8A("Lݰf(ٹ&.VzyD7FH%I1ewjMgqnߵ[L5:y[y<".~2T FHҙ̢<'Tt^6y.k8/5ɬ28B-uK)UDF2) dso7KϹIHՖզsd܍R^|k-x,@+e' * oUOŖȞٛ{^Mn&q^n4~0][,k!ʨK )I&/tiYHΥhaJMgeچ$ AHQĦ&<5$#\2=f]αx6vҽZӹ_b0)G#kQc6tRC)+ %qI"Z,'Ƒq!)K] ̥`pF*g2,<8GGpTVZ5'ij9\%neI +Bbܢәdɻ*j^F0ikCZӹ"1W?8\ٷPT"}dzmhrQ91uA5Ы܍^oY%u«XΪ\`_YUFp <7@|Ʊ2':u?9~c3zU]䲄 :+KiPDѩ2FAS 0|-G\LL^uRY;K m=B` uB\q<'Rmt9cJ:/>-V6R.$/ F̙1!6gZ-u6+}IkN/4qp5)/wꢊJ:Ea4 c"I G'F@-U G1cKPgqx: Ŭ/vZrE6i$Tʎi62Y9#cJX zc: cGu4dDIF$) Z8S^q!bTBΘ`1ReFz̕H2ۨJY& TL̓i g$$7_ ߡ͌TŮQur#]o8k8I9N&[* J2̻owg-Zp$xȇx/\if >5)Yh[Ri θd䒝-%O&x7')f =]=0Vc oV&2cqtn}O)xxe˽'L=O;tN?Mҷbm>kлw+FiZ1(NIܧ I~”v-6\9q^|ǟV0Lػyެy11ҟw~nx5=amzEkJ5#Qo8rUrk865[GymÈ>:J,p8j`rFOVc/[GlEnuZQ'6RVD>cjQqOxo4ة[m` q)na?%fË~%If7X $zih|1Kv՟⼋vVfVKDϖ91ɟID~Ï?T??O\Oç_~OL1Qum SǓ& 2oۇ_054ZZchP5W==U撷;Zf5nHc,vt^@eMr.;XaI~a/w$iĜ tZbK*BR,er j g9vJs[ p/$‘皊AFX2z@%4DL 19JD+ˡ/1,u:U.uh5x#@s꫖Ytq]ڀ:_zLU_Z<zFО.6JMs /|yk@!y\t^\kQW^+j’l!j㲎2hd" )ŲsƄyusrུ $rAe@~ll)BV'g! 
sr֚w[(GQ͢23N ' m.={y#!-?z()^*  =ԟKtHp;ުʃAq؛Rrb^Ӊ5Ϝz?MJA]3ȶG?{6?G-2a.^i>)3UbI_ksm׷m^iїՑp)l_dP\Ɉ Q7D@Ʋ BAvvZyʟ>Ėr9ABoR-t5WNmˡgAW&A€ʀ%mHLI3N Dnӑ]\w׫ȵ A?l8o0tnO~=]X iƦ%dr}!ꦀ| &mG/W%| PzݽkqE}y7즇o(ɎXv^XIi+RfH|M'Kp3"'*SDtJnpb΂wDðq I gUw] a~567"(6 jx0u>KDP|,wϙk8g̞1ǘ+uϡ |ug♙bc'ʵ}Wj'mD=H v;P1lI۶!Oו6yNz)%)%:M/]Ɇ5vHJ"J7fx&UxcjŽZg7ՆŖ6np5\,^gwLR kvUaZۗd1Ơ-S)ELT- s:;,xi*-(DDG_E_th4<_;P^*1oK|_v;3N $ITtF@4b{̘ I24b]Han7;ϞɹRc7b)[MUεb)#w^GtԞ+Bvwq}_f[8z"g4`+$T΂#"_A>bap+jEH 8 >I,DnQ1 rùu:1([; dxԭD';-Zd3<ηf{scc61O!ʴsJT`G V[k)#֔(AQ!1Y҆J7=¼5z>!i=vŲX3ߵ0e .$^o{fOZ}SoPUʷߖҬj.|1= g>Ԭ4*t<,*{ d?}*6re^uW#Ϸ3Xh: A Rㄿ^bS0btf2Z(f)W2:? )np09sH[G‰uXTyU.j!{&&VQR>zƌƁZ1 WL#,8ZI}e(SwW+0|%Y'= LRyW[ӶhC,S8*?HJsF+oE13ln#AYrx"m;ڪ=}#eʼnT"O,e 6/L F6#Jg)ZݽcB)7j<ZGEdQ0A[R.k%!RHDcۻ}$t$_Itr"Oޕl~9C;g1/X >z' 'uyiz;6soh \?U &rNA+\{Isj#y.~𳇟pX V\HL-E hsg L΃#*n \ܚKhn9䬎Ky+9o K! V*pQbAFКFEZ`1VYED6ʌ ;NR*:dZkq+h4MIe/zq`S,cYr1j?Cr*ࣻ[#bslq4 Q4z׎nإ;w.~U C5z&^|4>v03Oˣ `ŽaJ[ eNs@z K cΝvު@`+*|F:RN ݑL1jvfURk j߂%ȳskMA\'=J v ?YԪf.^ih>moz,ڛ7y5]<GC3-"$&b$1 h+yPFm9$ZVΖTxGDJNcRYUf XG$w@;AC\PpO0T7e}3}I LFI 9 ]%ct!NY̳`^q3tj.tF1$rG9)07[@8}E=#HD4*=u\4RiL -ETF H0HG`UN*kykZ\8{d_=N E |k|48jtgk<+ӽq?dgf[hZEwaZ"K3;\DUV+0k 0E JA)E7Day]6[  uXO& -,I֓esr IZCa8Ic,# 寧YwF!Bk=86=q#OnaTak4vVUO.Fo*PNb41ƋY5T삓(?M'-S{oBM=){l4vPs7K Q)`,&->^&sp9l#W6:dSMc1٨Yۨ\ R5iǻaPf:'7$;aPZC+h|Y(8*``XgnpuL''?Ïw'ˇw?~9D/ށ|IprQ $$Su[]Cb6G]O=!7~#_wK״SH3:(7"p_1=$}>Z;UIIe<܂ ȹ^.t[8DၱkTLc{Q8긓, q’(=3"jP+0UF1T)b+tHtt,l1`` (Xe[Is+\XD֐,}HwKf;.I~;ce{lY:׽U9hO(&n.(_X%_o9ֹG3ď9=F43_G58I1EHgJ`ۅ[(lEz!E}nx~uq J7eS]1=^^jx&g Jh׉U UT'x_b~ rwi]Li*OV [jgwݞF%2(Pz,(&g^"gDf>fҴ$AePw5|{0՘݌9W 帨_O$C>ͯ~-!,+$T΂ 6wHrρ$q)(BZ0i!f&2Q6)b `E!Gh0[~wQ>z/ ϙ{.oOB4o7V|n{ x61OϵsJT`G V[k)}֔(AQ!1Y\FMϼ0omlvw;MyùaCp~`K(v)P|ۻcHavfG0l8գs-c1S.N[0tx LqY̙C:<NcʠzCUG^|hv: ZGK!zƌ{jd*FDz4‚SQQۉzwŽ!tu3Iv}Q>LбT/w%k=mz{zk>â:U^WQ[qчha9A#Hcnw38nGH#:$F:qH2DK'=``J$0"%ػ޸rWyJ5Ul؇ټHH$c)$[?ܢL̨%潷x<<ղ@K!E0t:zhbK5܋Ѥ 4ׂs)!Qis!FO@˽FBuNӖ̗K_=~I6N˧zm jh]DŽ/1v\>s3wWNW'frh\X5fQ T],f's#}3eINMr-TPV2Ԇem6@v(|2-+eJ\˝yKl|ݯ"Z>#1ݞ^}ߞO9*&jQ)jk"ث1Db˅BJLSJuL|aܨ 
5aї`YÃ=xXwv{8 UېA~;jkf[ 9:ܒW2KG Y^vNj"\q /5Pr^ .ΚR$tg~6h*׶$DKNp!P襤줤|gg2U0NHB*@ Ynl V]IaA"8θ9]1?׆Fmh$(T%kc7q>H- `$(z #`bE,rLu B4}xI[/FctoJApB>;HF;1]g^Mmc齌n"4<*گ.ekM6hUE]MYh=;XCۜ=%qӉ }fpMm-/uY (qC\@Ack(%5Jr@gpՔ 's,bTh%|-k%eI%ˎ;O͏֝ݹeGOB8tgxF}IYtLtvv¸9X>|?K}Mjru}uXG!ܼŋgLDNl=a(+o΃ț^$duF堆"#{XtT%Tξil#jg5dr1cÄ1 6O(pEYdٰgkCuq_"bPphvj=/FHr,ӻߊ= ?ujxP%a[I1BnPsҸ_ADeh5qPk*Wc2?oG^J嘭N`Sp`t@HX1+ΡM`&&7pWQ6q`F\x7k zRJcȑ4s @ؖٔ.\,\ Y&>cqa&rw[h!n\3{#`k) MSU=IN- 1=>e!zgx«dCw=eYܜ1a~5L[Bƽ7.Dɘ=|]Vr4~()5V4^\M!ja􆧥QRhRR6lwb:FlS7WĊ1V, WITEXReϋ6=Adc KӪ!S"![`F3 D+I^!4Scsԋ(&0պOLĔ)\cVntN7;{ TLO9OIcfQY[Sӌ$TWq;\0Fw,TXnC&zbue>R.@ @0J_9w1oB%W@˃jmr\>frsT }&뽉TH8 x@"09fqXOj5# )ݱ"k?ڮMhm Ӂ/3K%?_K?mք%h]bZC[)s"Ȕ잒G*ݹ5%Ɛ1Kdk%PBs (]15i@&",|C2 HlsN4j l-s|D~pDt3"ΈCT&9Y!˞\+kJU F13 pXtwR l[3 H 9  ZR4F%eVa?}}Y-:Yɡq1 anw%5Ob6lBC'T"g }n \".K<f<>Aௌ7/D.ty|/O_ÿ_|Sqr]8/x _I5;( V7!kB(TÐpJU2։,Qr0l|ER{ ^M݄B6xR{/ ?aԘ0 gߎ=+J8F|SJKCW[B!{IOb·K?̂%99(;Bp!--@1.WJ&fh`,W}X9Sa^ 6>9%4A}I4[]/z郜zFI#xuRwZo:gѶ_?<,~ݷcq,zxxX?*AjPw!}~`]{ 3~l&=L{r~ՍMX=‚Dvql5^L @ E/zPΣl- yF*)5q>H- `$(z #`bE,r<ȘZ&j9Rsk}AS C"0?C4hj^`> ́e 7^ _.2"0që[Y(_Zvp'y!1L]0ufmB|ߔB-D‹AE)0PJB <<PJB S(a %L)0fԞv %L)0fDB S(a %LPJB S(5%QSFُDُꫵ;l]w~?z{GC襘\)ɔ֛=q OI{?GG3GD3cfS|.|M s}Ok( eoV)Agp &c΁ol9X+) dLB(.]v.QFOD'G9CSR:4ѓ蘲_;-ޚ& _r{1_j?u AkC$jru]GuXG!ܼŋ,S(5gcN O湋yRyՋDlHPbԤʶXtT%TNil#jg5dr 6 +c2}FWЯY!dgúpPs^:/R)VABI5HHP>q=/FHr<]` M#NCKJhs5#c ܠqCg° &.j ZjuLg*c&;uM!A hbŬ:b_PӴMfmb yǥ\}2b/_81LS r$F1 iT0rk{#3imڟMJ5Ơ#s`J\R$JdRqKrHS)ϠIȠI"~qR mT:jYtځ D+%q<H"VHvkSh/&b멐dk˗jݬck҅S!IQI)%3ZXz+ʘ2a?ܖXBO7*> fg(eſW"[ޛ~BIUڃymQ3vR[cR;ĬW逰 sVk/;r:nlEͶq#S*bnǽbq4HHa0P,K`>âmXỷ5è%# z[ak\D|bl|0@OƶM=~''+sZ&}NAUöW wjĈKZђ>f{4(.+AJ:^2GLBvFl/\m wcRvp#%rZ8qr8>=M 1W:'ӗu}&34jt1vmM3K*LjdMݭ-PfRʦ崧,2{ swyK)vLX=Q<1Qi"̓7=8vQzP:Rń%*[\n-x0qp"_]rKXaJ_0S³-PWwDQlqЖg dpU3ne갦&e6 ~ nR^) Mꐱt<VMƝnE:хtK_k88=Z,zʳ#I/.q!H3b1(KiS8)SF,)VR189&^ 7XCv\އ>"qhb󪽎)$O]{WS>s`٧lr}ƙQ.:d)%sҖFULYe+-gtҰD* @4DZYERO{BN` ܱVRNq@l=N; ha<1% 3Bs4v,v&H/SV͚_4|n\z^knRe"Zg}j}.V^h)TqBL/H^U0e^@h(^NR\q-%c8Fǡ׎WgmUÏ黍]9<|9,c?")!pVZ%$Ns#r\9& (XT$nz*jPz;.U\lˍ%*Gn5N/ _?R 
9X0F`[ӣV Pte8%m=JmqfF}(jVj5Z6v%B8__.`+;E]qKL֯i%U SD/btp:x~5@0贅 C=\:$UX .VװnwMӓCa4Kc,c/ Їl;z HnE/m~Tr3f\-mlzdb".MpN&' 1*͌01uQ5O&FMu?Wo^^/h*Hqn>Fq88=V.x,O| 7_a`dHm: C,[ YF̟`*S|<^7 =]98L:*AGm&6j\%e GیPH|?4X$qq1;Vv E-/?~; Ug)Ó~I |5OΫՃi.2T*隩 _)F .΀Hyw_o=>L˿O1jh*Ќw9z7.2UŸgȻt3v _@E}&*~V?)KoJ2BUP#-!2"z19)؁oE+IOmmhK}8y4J2'V0gT ,g0Ay%sy{_,A)d_<n>D"g>rtAɕjv%秳EА4<0Km*C0 q@hAC{$=U!=*3>2&"4 pq'AYa^lo@DQ먜bQum؎ 2Cڧ҅oM_}lM^'#o&y,es[.Pz8:eNStĵfѴNfaROw E1Q2li#AY3"ЖɞM㽋&{7M"n2ZRE(.#˒``J$0"%pG-bD!JqTQ<ŗPʰ, Ƹ#aREbD h:Og-.opg"&od8-AE/FCz7^ܑbZJws_p?(ٯ}]9][|*X|]EsTZΗrvVWB$~hymulp+AP!P!ʭSU$bAFКFE:`16ڊ1m 4wFm6*GT KIw 阂;n ^)g=:dz0WoZcj)l^vcÔJDL.b@vdWP`K3+%7D():~ o9ZڒC.LL0&˖1* < e8~.@V;?0#]RL>.M0|@wY[FH(*R$+fx2]8OIsM.qi62=c.=|G_ @(]pTئ~=8:̓ڨ^dKMtp=:_[=qٻ"Hpa&IDIJ}igY[ͪ\f`k.PHx3A=E NVF ԞZa)!r]jÅ9 C>\C, T!>`ґygeÓ`0ɥ﮺M-}6/ǓqKD3yqOcL箼3l9<*c{3i;oZ{c y<߽݂vl@x5]z/4͸khwp9hѪ<ϱD&j8hQRbGhQU"XW@."JTR܋7(((ֹsֹ&[W.fF\26;BVMo^>I=צw濻w6w}WtvkB!0ZB^h Cm[rP3e_SהfE`V ,xi[Onι<|D0>M. vL\C XɽMN"19?D"ZȡD%AzJfI͠+i?pm2/??q@kr:}uRxPqP)}AID܀+GS4rxxbqX\Y1 E5bjŐBDpg%QP'y`B4"v 9#KSx nPcNՁ?bB&\!ide@rQ]9|v;S8kD{}8F jf:Ddm_5؁ WP 2v5ȰD* Dq,V)%1Lk%soU ATc!z7ٝhr "`1wa dha< % 3Bs$1sllΗUCʗŶ˗5jgiKOP'2T)'s'r}2=1}.堰ĜixL+@.#XL+@-' [4(F5_`BҢ~#eHm[C˜Q9ǠHcORDeĞTykq><0@H.hYJN8_Ikek$n^~8f,)(ݽ ),->]|Y|yŨ;cƲ2,YNS7g?~0l2 j4v%BUep4>VTtՐu V {D0ktr!3+;v2: #]+aWi @{2iatH\([ ݧ:JS0 aq1MR*K\okSEl?W%:?*RĸԧѴ,t8Q|Z/vg8>ןӿ?t>̪m$IN]x/?kko5UC lQޡ_r+_%#Et!G}~m62JQhK< >؊~BI֧r*(NXz19)eGߖYv1W0.E}@soy,Ei"*g:Вp1$(J  XH`LL'[%9)wK9 Q Ƒ̂!9.0m#c'oKwiMon!o+x5iSOG5mH7C\HGx ] M5jǢB+ l7eԱ0 \nfx[ongxXx%2AuuewۼCb5-_|{9;,_d p$r[|Y(ËQoۈuJ DZ ~$0fR p"KMqJ`H FENz#_8=M.63Y2Psy{q!H3ҁ@&I<)S8)SF,)ѽ#xQTle9.˪CJ!ЊFQ^/E`JS&>Ҕ@DXJS&j >ҔJҔo4jKgOۛ A!/i `Rb SA}n͍\Q+(BZ0i!H"eQ&r1po#4έi^ͣLVI5楹v}V ;>O|sJT`G V[k)#:֔(Ba!CC޵-_6y=04u@<ڲ˲OR5>l9E(%;]e[MSn\@ >KeC qR0tԚbf+d xELp8{{܆ۙI@сR??tesT|lϱ; NBR|8ԫl_-qJcf7!H\5yͶM-YjfBx權pvZXuyLbp]o/,ڈf3eC"V]Y\lby{z/h dHDT cƶaT =}eI(:f5ǯdnYNNR~@߃bbp<< ?gy𳧮gA:;K2 CЂbRAE[N(Q6Nl3dgڗ}'ೳ,~n+ucR'H|HIxt"mB1bSTD1 9VmM${#$\HlPH6K@])s @MIb`I/L& 1",Rbx[~ 
!zz|suu۶x_YbQ+fDe2> pnLsM7׋ՙPr NΚ:T{h~/Gʃ,ɲ^_]vSssNIٛM67dsV:Cʙ S5z_^va[M"7[,J„DvrC8dPLviPk4.hVhs HQJl^|([@HPFp6WŊX 1$ [+tN+\j)μAOMBs NKBs 6V? ϭ j$4cgn5G{jz:S$Aj3z^er@ydDXc5n?D`;٥3J'2҉38NxG:o!t.15KVP3\8j 's,7r qPN$](.]vܳ~Ö߷6Iǧ5=tl5ž/Llz0OrzٟޖxW>$[O!_?oOAG!:z/N-^T8;%e %r"`j`xa4xֳ ɪEAIZgY*(R1W kAwUHulMcQ C4&ԏ]yrF*Bb ';Xφ-@=[wf;w{]:D|΋TA% !AzǭE--w>u0)o4?KJhs#c ܠv0* k2CaP.I'y[*c&;UM !A hbŬ:"mSY(ጒ#v8#j$5H`Г VC˾fv6*e6:K990"ks^ cV>|Gdp[KG3MAVIt8iJ?#hVqƦrulcnv1zkEUSL!O@6S=Sn0Ɲ(7N_C}{sX8ŖGPX$xr㔃W4-yuRM64&RU!(\Lau>THT Ho] Á' zlO}ߦ?}+9 SP;946A%S*5Ό 3eL}=R6&X|rۜLs ZhNE && ı("7$C Z8N 2'ГGD-@T`EۛOk-myٟ-_όW]LTN% M}{AzNq>|[QͳCsi!tힺ'eWimgѓ^===^jϤOc u#Gi=|] ?#Q3?[:6Gw{o_ZÜ/lgGoZ¿\3ggQ}oxZ3}=5y3ik?X5G3mOƼoezxYM2pم!%}tWUkX$LavS=9H459cI?M|+F &6P.D'RQ:1첏vٷsߟ;WɍG,A52%TN/(!c*ı<LLhRɧROU qY$H}1*ur ͚Va|w::arrd\x>aqa@)VyK LX l #ٗPe;jqزk.A͙T[zkm9fHB>֖2VJ9,xMh8l9,*l088v/j zs{ב*WV展/À[lӂ˹P- ![ r]:Yre&h,J5?ĒmN/Cw9n6mE\jlJ ]B+L4ƨjYgcGIDZVV{FiS 1*Vm⡗Y P-\ UIGqi=l>D.<"CpE+QmMS:;5Qm6$hvaoN}ϻ X[D7["8IN֪ 'J%ZbnFRQ,C-b❅B[BNVQa Gps][9l9-G[.6K&%0..vqg%mXi~a*@Eu=xU=i(skmv%a o1Jsܕ7g?(1D?eqPɐF%穡;MjeT$td_hp_A/o?~[Gq Kkؽ}o3Ѧ CvYNq"ߊd,AfDEJ$1lES)`Z\Y7bLy+V݆ᷯ#pPKUE6Y>tJQ Mnu\WY?>|"[pog~ߏ@/0;lH- ja#H\+bez.6TӼϱŔ\S 9G GgZܷV $lMvRɼ><{GQJ7|)[FB?gsu >^h#N;vL eɔfc&n&I~K/o!t\C)X$͆pSùҰKC;{F!'):sJV/*bL}q"`15w\4E57=4xֳ ɪEAIݩ= Tl k~Dc}yիcًfOݟVE*Š H( ;nGT(jiVg4?9d,67L^mIXQd,<3Dc=i ˲$SrUdUȲCOydFdwU{ٲ4nrҶo3w>;Phi8{Xpc(>A#cW17>b >:6(_x̷W}3}O>gZ}S:/D0<~0Kt>N#/?~0j<. <R+"Ҕ䢣L3IQ"Ax'g݆x;oٛ'5DM}"\ wHnyμmgGhg\?.г˄7 w0Gy4?>;nlf̟ҥ[ xQ,}gz>Nf{@4~>'fdF/ T5čKRƣτf9K8oY7wmؗI2Q{+0ZڀYgAE'nї#P{28!Xy)J9fMb\hU }B3xJB9W3TmdFW,*հg +_Gk93{zr1ӓş5h`2}ͲdVF%5'ۀ,@^ʭ8!9"E⤕hQ0&&mǬ\sLEckըEf!^11wkq}Qz5اCh-<&IR.$ʤx,夨Dq!) 
q氤*1iV&f*^r +:dk0XۨIq$|J|kT#g3vF}&2`ڥR㾈*#"#:;nS"$Q S(D>3R0EaOUْI.k% )$BIs#,iAj7UjlF/lj^u \,"vYK2.{\\{ߠ@`';{΂IIF0E Q3 swq7aûc.~|GU4ɫHQhNqc,!)9[ l8Vf4̢Y!Ws/^n˕h r&wMCT{Uݩ~Ұov=^E6IГJ;g)΂( Ee@D )K71$K[vΐ>.x-z)d"y,2Y+)k#|̐%iwd$ftysK@Xe1 0FXa 6Xu@P.[IHɢB' 8K7NGify8v4׳'ߪ2oSІtZf#&oM12 7gga' :h?fdJohO/`3_~kҩB1ųd/4CsHSjjx$`Ƹmfx'8Id~MJ-Lb3.;c{GFKeI: '{)A^:qoY@| X)ee"36N. va49kDŽ?w0ҵ^O::8؟_Uf ʸd/nӈ\$cJ;>q\"kƇjT6yfּUw*f㤾dS`OGdzre8hy;bҞ[`lH֬ #FaYLQOe.>\,zr9t0]39rT֏QWHuRBBOq,]OR3M}ǛNv* .aVʏO7YS/nK$!;aPClph4>L8o_Q\3UM&ΎIH@yûP~w?|8wGLSjo $ Cj?CZCx-+5o1b5b?ׂ0S Ċ fđAwI ǮU>I㭵H_IqoVWn$zdN-SU*AG##(Yކk=;~?IJ!&DN8GbD&I2YA\PʡNU/}v^FiI/+B\׫_ҎKo~*}Vi$W [0ZPHrgq%d>{I<2S\mڸ* m^DCRD6%Cv.ۈR<ۺx^Zyd:R9 2$mٚY!fnHKjYO H/!IrܻLO|M;E4⤝gx1~9ՂW*C_]Q6nvLڑeGӎpKi#8JwtF$vJ)C iґ093FЗyf:vM6t[iD/Ӳ]^ެvxk.m>e@͗qj.Ѯ|ƇԔ r͔ڕ :^\6-rݛnU~n:4}FOv"}5qуEHeyr2|oZWmжl7^z&M9.tuMkyx&>1lI JƤRҭ5Qom𼵬˫<+WT<`o ʝ)0,`hBtFM^/\X0QKBf¢rlfIRw<(M(v /RNJ  |h3N L"ӑuYq\e_d5rE۳n{%=<,dV$̙7?M}p3Fko%3OT<]f6h00h9=XԪX.~<="< "^q =0<0xc R_)gR1kɼ)y /2ϚAp/a]URGD0X`]0P$Rh]&p QTjlqlIݜN;kwyB|<">uw{oxPzWSTX3~n)~'/y&(.6UBb5ّ"ȭ&;]c쵏#ϋC IH& P\gyR<rd XErY:SyI}=}ό߼3AZ-dK&ZHRxK˄+gjR [wz]X!5 m"NJQmn6H ߁JR /Ev`IY܍]@V{C-egy% p2c;+8Tf1μSW"j_~y_rr4VUިwmJu:*R p"cIOThq ^$mVw'l>_޹f%^W<|Ԍ$o7x6Y 6hLecrAݩ=x*'yuzc7\+톡Tx}}էmm=tIO7}HD^-iIӯ냱'͝_f/jv7-N\=zo~7]Ok22]Y`n]>>j GsRFÏR[+Kml-Gr|;eY{|e2&'*/M%+;K |! E*UȤH1%IĬKz%*DZ \$'0(l'NH&U- -CgspYo^4ܵ9=b8aC0daw`PzlX! kڨ@kV##F1o+(ĻOYՌ#\dw_0x8 ]I yc8)"4:?R5+#=b(muY tP%\a`*s}@u2q]X!BnjiM BV0EDtAo#&gDYDZE]vp]q/ Ded,+v@}9~T2j$nm=b@=!`,o`Tcy+Js(B}*,o_M оf7kj~Q꣞eoقڙc98$UadYʱ<"-=ii{§UHn0+ŞA&_̌dfnN!%p! ºYҗ6#\fQxƘs4mEk+"L8-tI5yu;86üYjlL46`ii}3Zl>]O>4%v0}t;|;LT0bX%#8D'>GIŽRٴ[)XJHX. 
t&Yq(>MN\%1U̵0ߋ^I^A.,uqNQkN|7lZ\('Z P~b [뒌]> R﭂f)W5x,Z4OGHbv8 'R0OBȬ| 'ӦEڏRYHvx9Ua RIJĢ|)xvϥt#Pɱ_V5;qU5PUvۈ⊨t+W=U!`U!W\U{/ W_ `_hE 2Vѵ(j;_= wSK,ëԝuhyA'[u&tr Һ%!\[%=}c(>lHLΘ '*PIӚ|Oϛ)3,W"(m "ȠKe0M\yfb*} [8wNh,,|o[3QĀ!c߲ټ5rK@z!/%+&J-G޶7 e~?Ӿq̻Kb~ps c6[CG4Ŵp^8Khy+(n*!YE)$r9;drĬ.>QL:l1[h]fٽ1-bw[6uws!-ev"O P+供O (;Wh>1ѿÇ[EH2%Udxa xf!.Y#ܡ8p4ضC & e^% ̸ơf4;I2ͅ"u9@V)vp)J3ͩǴ&vcu ̞Qvz{n2{VAc4&GNӺȊTr-0f7 JE lCs;Pr; g {#P2F@j.eJ0 w R2AdKx !^ittyp1 H;c! "IAZ[,(0)*XQKA˄Jub-1~Fy~[vIkC۫*8Oel~ORӦ'-Tabƴ?lf\v$@QaVcNf{#- AM-U6LP=wh{a&>y'tq.͆f~gv)D:|VgǛ7ǤI',UAO~YLgV.Oד:Z|Hغ^{\ _ ,/wIJ1 |,Xvf6Ed^9lz]7V-pZu\i!cIoQ(08# {jXC~Bt緃]̣>/ I2 r@Nrg Glq/4ᚡ*Z?bÞG /HIYb􂬍+ )FV"-D*I /\,uZ1Ut@kk 耧vu 5tja k{VG~-3,KO_L*/0deUH#YLSgYeYV9' \$'0() $8= EfƗv@Orwb6clNV!pN{_~B@ִQ1֬(1$%cVAoĻO8jFS@^nd _0x`-R|7ϣXVGijZE2#Vg\ȜHU]]o#++>CClAԐv&xJʤY*kfPRBZBN6&*TuoG4(?4(iRc&gG4]2xN/L.>K^2<ٮK_qxnTP~ǫKrc}-ﴬp9*Ub;cm{ĬKdٮoUzg|*70Ͱrf=$4zy\狩AӺ]3JH^mlr^hyxTљ}W웬BCZn\k#]n/e۞ .ҟ%>^%s$8R>A,oJVVS) R_}7 ɇ7?/{ ,AT _:?:]̖k}_0z΁;{nm&mwo'Vzrxwyj?xF8NyuMcjFViivd+@i%r]Pf^Ɓ q2<Qy/eb5{JTSX->t*`t{WAh}2M3^:yT:zO:%{"Z~M\oTV+gPDt@ErDNXVYi : :򍂎^\9)Ih@ 2Z5,:e<&(XՌ5)@FQ2j)p՝٢ ʅLYspF5A [GQE'q:b53*>%ȃ%Fh,6#JUNRT"'mt+W 2Zp,ŀ& >EKDUR9rr"2!æ$sV4ܜ& mdo@|LL51rK%) 2a A2ǔv5bRۘs lA}-|f%AGHZ =׈ kX+Hcȕ .92PmU ME}bиDˁ(ho[E8 4XRvFNf/RX qt%O޸\B})e+B) s@r3Үfz+͸+XF,|P,:xO_i_7p:MآXc9W(*5ld'Kgl ()Pr hab6u:;t lITDruV܌q>;.`Pv#}L< ueIʊ$fl. 
E`b'#a1aȺPj"XIt>Y&Q|c.&N59l87aԟv8~McD#"xۀA A)t4^1r&gH/2D)SGBc:;1d(IRq I,Yؓ%NtcDl&͈xvD#d "+YtFXF\$JDD`p8e6\aN^ R35>M|y8DkjRdO܈45BXͣylؠN37ӐZG REeXNpo͎Or0\l".H“En'7E`.4uo'ê.oQ]uBATV>,NY##Z|sr:+Vuۏ1,_&5n-26};q/G .wUzF*?YA>dei0CG*5"+{JgW,;WUZ5x"XܙbGz=pL  {6pUŵ XJ#W~X`k*W,*W.|s_'hoTj>|w@Oxr@#{ 5wL~w|ny4r61Nz I}?5?5;x60]uYZ'aJ9i' \x_dl7M*y:oy;ؽd&%o:?r>w79~MڈΛuIڄSl?B;8VsRg&oG ,R5;ndۤ78~FC^5|o_~SHI]:D袉+`j^ R|f8-mndj:n8= O\'b^M)^32ȦGS]pFw{@C%K/A^%ܾYvN);vs@%'1͓ CxS++^Z @#1ۇ݋>lq"eMER4*TwLP(K UO\2/<[܇_L&͝iЫ߼/bkﹳg2NOWb3pg+y"d2k,yh?7/j i?i*CM|Ud(Duެ`[u~r#LIqd' Las0H^J_$2}nW?YU(F2SʶDɕDK-HK$i*YIorhQ9(Z[s,:J!#XdKkPE> Us@>Es:mt̛qU竫WջS_Khƙ3'|#%GiQzGiQz-wlGiQzGiQzGiQzGkGiQzGiQz 69%81#M]_%P[ܹ1p+FEQXiDZ,/mK&z˹]0<ZIK ɂ#EcnQt): eqH1%6(v w5u(.yU9ެ!r&&Be"sحn֜nuݮjb6w{{NE;GOYUJR&f&}1{PE[cR`!ge)ʑHڂ)"KP&DIjsb DRh "OR:mr5'$$'^j՜2@AFK9KW 8QE$T*bEY)],M f&|t;b݈%-ؔזB*>d iC Ix_s[Q:VخU12(So3hXOzuW{uV4{,|zՋu_`D _܅ҜȃM[-SW"tZk=coX͆Kw\Z۔>\=w|u2Cf_!dW{{PCocNy\,Էz{{xSb^Ns 67_=Rqz16,8Cs:WeeR}V }UFW}ڸ~OPU7[綁˞= >!f? %I:5.1 bR2+HXdt?ut"k4A1&ZX޹ĥ,rB.{k}ɓ'I;~MPu gXSHK`PH_tY4 l[ْ:)E.R2BHTZq#N2퉍 YsNHxhusH{k6N^gdϲae?ȗh<'`|I(yPJ#Y` ![tB JXhRQ zrHڦL>'*UPy0HFf9!fWiFKqR#*>YVf<ܳxn׳po t~6[|MWE1B(A*je?@!v0 Hd6SMA wCu|B NJlaN+]FgJEDKh՜b j/EmQ{doykk)+rj&jYbPI6.U7d5â1訲eIY*O0qd|HEW|l}k1f9S_㮗Ӕ+0Oj|ucDGD<4(:-KN)4aV NgSѐE0QJ9J }npEF *Y HM6`UٓLK2ֹ͚s\e~Ձqq\:f䥸hqqŃ6ɤ;I /LLtd[wl.Z)434DC)AF\| \<&Qba2U->bqL[T4 ;4lmaoIROKR&6m>tzɀxMNDGcn,O7筩= _Iܚ]@$|B}e\&fkDj3Lbٕ݉һ*iKHmD 5˗m2RKy\h\;]495_5='!̈Š%:YNA=6,=-E6qj>{o~Ç1ojr?K|ϵ^N?򥉟p[~Yl:O}j5 BayY? 
qZ/.}y#: Rmg  w:ȡ^HܲW-1 zr,+( +8 %(ңs5|ɓ" x船ԖsK@Us#2:)fУCƣ7NKQLwEqǽy"C͙xߚ~xh] gяHH &SnWwV ]lBZC+ttAu>J00i}MJg.gGkR城?uyz)ջx1gBp^Ķ"zseŊ q&:RQSL0e=ZO` @WezznW[aBT|Ό")qXuR:vR7ΉyH7 EԖLcS+jqCG?.v8p`ut bݿx-j8ʤ(ZJDKbdl `C-"{6l`c:Y{Tey7_r h,sZ'SqJU2<ܭ'[JBhB0a֎ <$(`A3E)ItZ'uU1)QƤS!6ѐK$bRN*jKv@8PDɠAP@AF̘3`|>@j}zlH|T5h]z?_+NE K~[E3++|6&׿O<'Zj$Y!6Ҩ &v:KۍTLukUu=]4%$PB^܈(*^*_jx}g^MFd|xX~iOںtO)7Sp!q?2-Wd4>DmKO6#/ʿr~>1{1NAqljӳ="d|y^|6XUyI'1o#NaH}܌w*q6~No|+$BxOxݻ@}{p~QT..v"x 6M5Mf4Muk+HݿN2mrƾ mcYXe7^NvmS{`j *TqU ],6T|bWӢ;BA|6ȲO?6{Ͻ%=Yf.l϶3-ޘj_v ǾnnMV=öf9tm+1-!03p]6{mt:)S *BtkO 聇 A(=p9̖ cN$AR 4EjL2D|2Rc˜4km#E`<*doQPڻUtz[%(!,θQЬEY꼌Q%4%l\o@$}"d4Pb,ȸo5ɡ:E JQ*d'U6<g}x2c齣__w(gbBh5W,@?Q^-? k@IDGJǨ-%76`k͜dɉ`1(M'l$Re 񫛉_8V\CObOAZK!CE^kQPԂ9n `p"}XH[C;oS, >[S|`;8[o[5a{opF 8l*{wZ8Γ”Z^6<S:LsTԭE+=XGo++atL`RX u@SBNXKRjJ /__iK˹bUw,[co+Q= ^m"wx|OQ|o!) []5'7'Xț_ʏv.aWt_Mp˽ K{ &/ؿ~7iد6 f\(E%$ %˭*0yVSsLW||N^X^4Lj˗E'3Zu4b#4,0GgqhYZC]#F)]FLj3yeL ]WI j3)gpE[Ӄ#,GWY\CPZ~pdWWh~fk8⚣aW( *K 8e P`UWc,C,I W_\ r5W($dzwr,pϮp5 gZ1!?gP.hۙtLqhIyAĹ`MOۯo-Ѹhǩ%UsOp[$cb(h`:;3Z:LBV@̾yDpVh*!*K -\}pe,ŴrcL[;a7+O'՛,9nS|xI v/Q(K(BIi,gaၙY+ a|7'I1cAJV !ܻ|㥔9nT o$qI2DR⤯N]\\o0e+鷞@տOGoO~cԠ'ۑ;/ǮRm )Nnzl/5"Rs4$+f+K+͡,-) Ihrjdko<.ف{\%(S~h@<>hÔp#:fSitP N$A( pi,(=βfi:3H()t2?:hRnuNsd\wyЗ˵NAU8'9uN'no@Pc29sh{A\=eg8܏3z[*=!8O*^ҚK&ZBUh=9[}y HAXL:X/v#ZnLvD/ IxSR\,5ZX" 񠣑 e(q[.2hiFPdaۘ8XW؈nn'Tm}71foC6u-&7iӽ6AZj:O`2NHLW_r7W'-OJ-}#yŐB Rɹc؎f)rgd9i'G`@(GUe~׹ s\Cl2u[m|[kSIGQIz#\H )LIt4:DȂe7ʶt MR`cDt'qsf|c"0 wxl1q֌R1Dm ;cl%Vanc퉵j6/>zqْ2WwZZސ gHOpTݐ(Dy^p#jsJD-LT'agRl|fwH>Ad2xdJ 5ѦJʤ,7s^u,'> ET^ w:GdQ{4ykP =%J2̟0޸pxsMՒDq\%o~0ړH''Jf|*n~:EC7ȗ}s߆ =ڧ;ŭ`*Q}7U8t~7^_*7BԯS!8mb;\'p5Ν>soT.wmH_80 Xb?-"YLQ$%o9bK[ܒlӶtfYͮ*>U,V񯽫՝7N^_f7`ѬL޽_ͬ3{I4gV%>{|j414Ҫ΍#{HQ1-^G݈Yato@ëV*ݹ|7|Y[0S1IWEVJH'@Ę?r(i4۰.]/>pRlQZQBVљD9\JQLNl  TDyisPOƧOgkCxk]CnUc/9C7/e~u=w" ߣ%cJ3F.2XX-dk`KNWw^~EX |(.ddzU,QZŕ L5s)N6A)@$LDAZt$o"/N@9TeggbcԌ=Bͯ)C9W--Cq𑋦\vLݦ /~?},BlGrZGtDx0t݌Y z$Ѳ=|"C ^mD#XrRfU pDFwEFQ[}[G/HߺKr;uХ;6j\o_}OJټ*Os#Mj4f%+'^2o$Z{ 2Woـ-8fݮ WpгwK5N5qs9]reui\lGDH+H#,%wk3KeD%̇奼j J*Ў!] 
cR;ĬWG#A2,)Oc.*kmXE.~1&؜$5ErIʗ[=R$Jؔ(y[gz=U_|]E R P'Ua5BV0tE[=:ZKӀg&tx4Ah9eO5*X2#KqޑÑ6~.;9ŽzvB̟RFK{]x}.\,.mO ۬$N.//zm4[+H^n>z?m O}[{}6Z.JOWF{_Z}1o8k x]E1|P9'5k>蓝 k+4Fݨ(qxX%BeRY&-(3@jX ^aċwvA'Y)V5fcdP ƣ)ym>Ea !aLȤ5f]1WMZy\cAJ'ߵah]j&0f|ZA!\A|r /[z6Z9D$Nɸ˰[:\7IfPx/)i0o)>x2)y|'.u-7}5΋0enYƻ(}>U j^frե4w(V ժgZy3+Iy!3%QIggҳ)+8yݝ;Ivdg-Lx௺'m4iXvoZl}/=f܋5`07딞%jդ{EtvuhS6bI!4*H$]x>y\BNv &͋V*x`hi;o]ҽֲ<^Mׯ\('ߝMk`1υ>0۲ol2,B$kl0r=`7_χyrCl!Bƺ*£UGTޒTc_'hܫe٘k6_/;.8f\2&HLX+{}}҇^ʳ`$c9QThKn@DiE!3h/ zpWܑMfPEm0Ej|쾑LS_8*t>Fu^Uf7y.]߾)^3g3Ҭ&"FIoj (eB& OrI]ͲƬdQ&iҵQ`g"I G䝘%\*>r&;FK̒NKp1h]T?W'sK!=dBڻ 51dRy'24" er3nYC_'U{)'۾&C(F]?_ܱe|qXRwm= ߇'(<霗^ diKhWWqXFlXhJl%r#8@ő'GR"UitB Dqud\ iD ǐ9zsWLWRk.jJN,{;EwE!X/LbWՆs+l͆~"s [oŰVx\_ >z0QkZA|GKL Y%8t깘O*OLFKZx㓕ܦfAg?{y_ \2kb&p6Vr4I - v. xNGQy䉀Rt2+@'YL u[v_ݎǁ`@H"#':I&^-MCt{nx='SC \:_؀. Bh$R5pnkqx<-ŹG_xŧh=޳| ϒsx*w_'˧ASYRx@%T@I! <2gƁ j_1x s@L:pe=%eOJ%9{DB֖эMj7.=z H%iT(`2/6k`H%O5w'PћF8'A4 g0{Cw}oI8d@#]eC&ZG؂atF8a2U2g)ROBl)|zK7xp3lL'۳R~CAa.0,֟P~qdlP8Sz/J .~5AL >'ƐmqcÓc1A@lOݧ>Ҕ݅[5>0楍+e F Bq4Z9˹qk"vRn{((v~\|M|HMÑl˹KǶ&V&yH><'REZKN3"\J(ŚpI_J$u)%A^Dsfj ^omG ΎQ力O|ϑ'Mj/OFnwvu_A>v g;rr፷9ZdFOɇ`U!QtȶNK'ɂ}LQk0Hs!?*Vc9l %AE']\Htj,Gp.iQ;W8}:hpa/ۺ\1>eD32#ґx[tUKZ$S4jb#(zS䬒u!T@) Q2*#zNc6.ղԙI3HbK JZ!, ,)[9ʥCJE;2/#/yΝU6ڙI! lULX> LURԵ#/>/ fGwO57EpՏϴy;kǯRa^:&RwN#,uZp|肶3$:rd{!$C[IpBeHvk)>y)2HacZpɊT~܅T ]"d*6s !-S*UGtY$zGGBRwvz8p=\eƝ)y;SKVX(dpVJ&ʾ@?TT(R,I#Kߑyc%Sؙ3q"ˬD;>Fy胵2HZYHV5 IGF:B O=R.0.zdVh kOŒqp$ث7+I껵nÅhSٸb0c,gG|v]ˮ&ϫFj^[]38Ğlw]7_Vo'gYk?IxˎtCw.>ϗ+}~:F ڛׇ]Izk}Y|: |>[3{w=b\H{(M7'C SiUJ:M7iҴ"%M`UqUCttP:: +MN_ڍHW S+|xtPj{HWHNt :̜]CPzqHWCx#z?O04]. 
=jhGW %K+k ?!j ]5LSӕ5Fl;ˡ+6MWN'jh{骡]De`f5juwcb^%my$/'iK!%DiIzI~k[_C՗_橍:nprvI.|}<@Q+ڝӏo_FHy6%Z(K#*U/;ޫj>,*Ky3&2{?w켂+٫ 2Jd 2BI$|M R=y#ñ];)xHtV+",5 ڍ؝ ث*:u(xkגR"g:.zYgnik]]Rs/EPM<:wb8=]p7m$>tI:!>^>{ʝny8̘V{kU}ծhE?\EM,/OW;&GOM} \u)h]`dVQjdM.HdЄu6?_\!)|Fad e~@G-λy.\¥`U 2 eSq5YXDVm42ICxgfRŽrdb$]"M(CZ@1f2fE!7;[QI#|YKoEKI1r)XrE'})cN /TF\Yh9*;Mki9B![՘Җ-e.eUMditD$e 7ԡQǘii{e[![nJY kFYMmcR戈$%|M1a9maEK3yRob,3àJd݂6LemShR֤2Cd]Ty ⺠1kC;Q5Ip6pPR/*) 2]C=]/xa9@=@'ת5Ȥu@t3xt!n!A;Xdu,©Q?z}%:9.⹳3ʃhɒA^eF"v Z~*"07?q 's'2)֪O "M{Fn,(w.Om Gxq<IrCtTw!|lC 0 P}{#AH S!}@,R򠽄mSBFkh:'!(v,h AE/Bٗ>AcFXXw=}k,r3  Р :4qAcFfJ2RdA1!d@rAVCFdU9\W|@,2^$EsD;k$ct"ǢM|ۗl+iw& h%df9Q-{(5^Zz+ڈ,~XѠ;h!$,BG 4/}%ж41dKW,ummkܣGͫDY}/y]-.-se&⧢A0loNNf1z:wѣB1I@)vfQ(p[SV4ךB yn 2?Oacc:' 4 %xzږۂPF\nD<]qsKn}Ї\N3z*D=2HeԠʱE/m=Ƞ "!Q2l.[P Ŀv3C+NIS:b,4ȓ;s{9;Xt >T@0\8 Re"i;cGQ2ÃazށuU)h~z.ڈVbPQtA1sjo;X{uƓ ?R>T;)J"AL:/Q&cha[4՜ sȟk&/Y%ÎA*DEC&l0%v}jU ȴ@b " He R 1>!+y!?mS[A%DySS0 8KZҖ&n͍h( LCb TR5Xf=\"Ir;$9 2ބCtڏ.XНn\ O $k|F:o;34BޙHXx!Qf-_ܭmf)fo,M1Υ(\ڦ"W],OHN& !Vc@bjbDO8m>ߜl=7'%f_lB"+ "iEo]vx7vu:МbutT]n7]_v\_ʟ +A+Ѳ[)trrNG+|.vc-ԭfʸ.j^%mbQGJ΍3l[A5ב :U{vfEE5zzb|/v?:^@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; 4Y'ҺXuT"&"^ D*b'@N"Pb'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N vM '9HшJ=N U8;L v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'Є@) Ahq!?*e'@Hog'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N vh:NKQ9ޫg'?]RS*zy}fA_ܹr~g@j2.ApP 7jKT7.A%a\CD rg ƑʯWME V+U- ^ 9"Iv 9t"\A2\\ka츢q%N"\GV; }AupG yjZ4v5Hef\rpWOz]Lgrٝ 7<_~'.;CU74}cMk􊎫0]xN;U?ݪXiJlM M1LٶMڜ5uڦV}Ӊb)=:ή\ o[ɱx+sۂ8juCPvQnF"WfB{hƤ蚘J /Y޵8j0Mj};Ie bZcWlmW$7ZpNڱTj*+m\p"\ApDWk\Z%ǎ+R%j2&7&8jpErk !>(noN~{-雓cdou8?_귳뫴|ߋ/}?ǻ[YL%]?7_{޿q2=YG<_6wM䟵Ν:mcJMhlpMF䞫/WIO_~$g ^0gV~2TsO9`bMlC=æ$7ZpNT*" _StՃ+UaSƌWYqsaS54$U~9*R%_#> e8.u\ kWRIqF{ ވjpErW6 *j Q8~\`/UHH}tE* f"btBpE]W$ZpEj;RJ\I񸪗TW @g T뎓 TDž+){WOzjT׹sN >r+!a{_ IiaJĴn0 W녪WVT(3VC)U 8X H76ȱʈ(`\MWVJw|0^jpErWP+R jrPSt)m-"fcWqX XP Hn jcTq5A\6 '"vRqE*C`\MWW4NrMW֍~TzN\GV<;}|\  <38HGrd^I)J2ZRY+F{@j ?NJ-Pn-akQ IHh T4czVZ VV++m-"*WRqT+\\/jRJ+u"\c'5LDWVDZTZ 
ѫpeu)[OtR~T*&+/vGc9Oj` vK 9롂}w`a#fXJ5[Ia*yÕ\IaC:cwEL_ 7h\_Zymtֽn5.*7' >_r}V?y,}Cn|wBy)_PM_uQ ^߶[ \0oWe}}]?fy]~vrD3ooK|ngX~;[]_p泿oRwQ.gۯހ3wr<*>~mcvuDZt^-e5M/mR;٦C2 ӖELmrcq"]}p%Vgzɕ_1[Eo  $dP$ke4!裡eu=g43Tuuw]*53gǿYho1*=ɧHc@D,Fh0v8|m|,oeǭ.\_b&dQZ_<<九v4װ/>ߜp#ڐ'ry \t慞 >~$Łm7X\2|7~lݔOO4^ɥ)gǕ̔<':t2Jr7~hx0tp<ݾu0K 5?4xQeۍ]\q$?\p>rutÆ9]/}XoIN k6>QYIB׳/\ERk|ٿ6BtOfVūCעS[^~ŝxVfst\}OA0$QSoi/Z99z AN=A2s[׿Dv\!]v4'^5 }(j胿ch' /۱h_ -(m6RB[C*hDY+j1+$BոQQ&ʦW|sq\`c8&)*1d1!9o+S`7vfCO;=2yN|JW 协ʣY\du%h>~ vj?톃4wjNqMKR4WDC*ͤu-Ib Qf"hg(HE1 ŁSt4wUiY.`*9*bPq5DPh^Ϧukx؇o:T%˻@!Oy+>*YWP pf¹FQ4=YogsP%rk!*`KA\w v9.B6ƍt,bi ,Zg BgS_@M:ۜC ڤuQ* `fTK|ZA<smh9T~Q>4rfg#+XOF%EPљ8iN/ bxr ,瞧¾_1 ւ犘L̋."J*ѵlK>H$:,RC65*HϞι3"]'/zz*כ|g3K;D=KBY;{LNc4LdH7-<^tc:ICQC{8ʞ̓ḰD+=Y$Mި5Ԁ\tij|/AO;Hrcsw; =ڸy1f{ؓpȓnIr*ˤv̕:z:WVt*¹nrªeb0U @ٓEURb̠MYpWauV5BPYyۚHh|LK*K\4V:of<75F] BZ8յT\UY~U(MA ẑ T1孡I!fL($c,"F !D5j.$^:-.֫`tiu AZ8VPDH3X <DtAE*؋ɅM8xrֵׯyVؑ :^셍k%}!(91j`vPʅRş;NBNhKhJh/R콋Ih|IQU9fPP]EVXX+WlA 2`RɔRRWUJL-@5Q4mrmC9U.l)I96 "x6;AxzkS}|-e%5f9 l͊-dSe^tb,2Jdʼn>}[L)EEb_:TC6I8Uw5uBR@sHt[dE-aT>j]k\Um Fo0 1xm AL$U=+joɼR<;²eo٧6/m wթW%; XeP %# 1Q{*H.bRm/E#_JMIݴ\bz8/5&]Ksk_h"U1R}[9f7қ-M2=/7l _>͙6ds i b@G~8ξuÔr(9Qdkd(AH%I.D9w <Qʤˋ4ga:_%S!V1Cߛj~ƣbK:xg5aD`T؈kS [h ,ǔ1Q:*&7{XS %Άؚ*f!;bRmk{껙Mo췇]OaH6@"h=~h(;cxt'N/(G Dwox}j,h+qM 9kY)EYD,YDOo_ r< )PhmȂJ8jx:[JΊԳ"Sׯ!burTTBq&P4DP,f.rYFPb$^gəDc直No(?]];.~]YgZLÖrR^\?5ܚL͘>QX]|.su5ˏ|Q4^i7콸nc8 baa|J+3#J~잗zF3R-uj6yxX!c\NWt(wVχ3NiV $ "aHA)ǡg @` tuYߛ N[`'R 3.*\/.Vow-4BX|/ l:< d%g{5/7<\Tro̸ɩݦ48ʋQ|Е9*(qi:=:Өfեx2Dku]|rZ_xḪFS1=ӏiճT{v8R'߬k!%%W-V5CV6fG>S/U|^L;zzU6θpuY\cUߪWۅ Oa ߽M|}~u7߽u]&RuJp~}!\i_o~CӺiT[4xo]w-W^Uy3(9m@g8䍗$Wy6M;Pw9_Տĝt$xHzUA4XȈHR)c̄CNJG}v@-ZiƆ)y',E+ 3sc8#DwŁ瞲<AI8hI$NQ#^Ge`jwHpg3Cցgw,4ºC$vFS a뀌W+c( @{Hk-dT7ɭRii !',u!ƥRpd'(T;H#LݶȄ#c8FGPA&Qӆ)?_9pp)# f{"渹ΐ1ϴG2Qn07(ӝJlpӮdQi8XjT ,6(SZ8S<;mk,}/SFS_g{51"R'&аT3/-U8 /2E#TxXM1?vY4!\^ {-@)U鄸MsS@ڥ<դ z)KZ,ٮQ'fڋE:XI @+ǖ=;bm9CÐ@$Lo^&v.)P XmPI)%3ZGL4M0&[ff7 1jeb5^kc编,$l$}i(f=?w}xt *~1ri +|rPOC'gg/DV}?WcO?`5ef暍Vn`[ ẺضLјNǩ3+NtKt#@&W*Ό?G\) 09q @|$Fx&-pNägAȒ DA 
,S2ϳF,3㝯ʄQů^ \ȴ.u]'ueNlW<떗UW_X-26h7Cu)?*/lEUQ}痜jCMUz+xfr>+=7U<*@I ({ytX\ G3QN:h[/|? ߶@%MM(tɋRYK&qHiC~tt{x)Awj*g;Вpt)bHQ #l*<S6upcatӸ?B8|dY^-(J+#dję \YجcȤϓuZMDnNiͮco &@Ǔ4?W5 -X F'g*;_}^uO4{A5hۤ5ܤ/˸6^|xN.dzןE>ް/^-eGK&]?kA饟0YX|R0EhڿF¥\`&eR Ƣs2 Uu 8pyK9zA-)E4) OWWWW(W\m#wȌO ߁?ǸC(j75/6sR0{z29eJc@"O7,B.:Vtkb u.ܛNpR|]p"~44֊Kse" >(S h96cZpnNf$705$yzwg^m7&gm];AKv`ŸFZkXS aҤI7mdm.ٽ~Oe}65ܜ1c.Eq߶#7S6ad2ե[*nq]3qҀc`\bҥVpb#VfVy:3ϛ+C \(wp)BSϘѠZYJbNW)4Tk%imʌfRf\qO [}פ-DKĶm}>Hq~tWtݶn·TqBT]/|\[D]$y<.#ư J-m4; DZ! ^x.y]̺8ђ*B@ϰ Lqx77|B&'D)B%J^ҒqMK?[XU3+,Zq!)8i 4)PZYd0PkYG6u䉐ϔdm(* 9_i6Ggɟl{$ C!  V*pRkbAFКFl 1VYEDleF͝QF'F刊`) NP*#lx7&3Zh0ίV{\=)FFh5G,$ Qq37A\Iվi5ӻn\1hӍ?tU=+1"zo*оU]$h'SNԣ՝V=2\I\M]eHǁ;IvpuO W +ՂH\,d\fLxfŷzCPbF@}-QN-Oo^:D7A3y5GP3ʅt4G0 -^S0ªL'i7c<AJx @L=+7pĥx_*IĮU9T1?J0ް$VW gWLH-KzUW}+Wp K0#J\%q( - ]$%-\=C@`Ԟr_*IKw^LRr3+)5Cd* ,J \|k*Iٲg WJig0 +خx8\%)i WҡqF@`c[Sp]]or+/ɪ⇁A}C ~%E&ϩ<֮ݲFj{KɪS9u+E+jW>껡+zk݁j?B_L~(-[]tEGz[iŭ7ĺ7Ƽa:$!NߣY+ih54 ΄дӴ #MvIM++y5t_ ])C=#]@ҏ\:"RѬzL0вK+EHW/aEtc\ ]R_;OWt"*ǿ9=Nҙ\V.~mYq!FGIq'mߟ!>yoN8/ߺky{kN >@o5 /f{Kј>m̽S\wh 9yv?ΫKtwZ‡H.J{ol,0~̆MOBnliL"?r~ziX7%sFdԾ왹l<+½ 'WR理QpN~rgRʣU3h8٧asFݛ8W) rjAl-݆ɴM (5F'r~WdC:t(bD KGWDW̴R3ѕtRIt*-nX E<(1Gzt nMJ'hW-JQjCϯ8]8^p=~h݁~(ia[itGz[ѕ+Z hӕtHW/\~mp oXYL'v;g7.M0Vή-ۥӴ> irm}.])WCW 7ZK+EM.tM芝&ucnX ])$K+樮^"]K'>ѽ಍k+EAېҕ'-;G])C/m\ܕLHW/ycdpЕMYJQHW]OW xES 7 7RǩIW]kZojJլ7SNWCWȡ׆Q^z}?j?_;jޘ}H+9S^?vv/=e,evD^sr{'?=;nm]“_?|ݛ鬿CP.n/UY[@(UIgd.~gg5r?6^߽h;n{}|]ewagM]=A[x{\Oo{ĝ#D1?G/n`>Iݟ\niz?}w .jO"Sw}rۚBHmͻgUR !/@$Q>1O薉<\f5xs>+' _+;4>GC=Yi/7~m6/Ŀ)gqĚѭԲ%'S-m2|W:ηO7w~DŽ ګ5~ݗ ڜVc%qLIYȵsEl6BB17IQ7\Ol%$Wra;JB+͠9qr)Fn.lG08Vb;ia5}0 cM-K`0ņb{hsftD)Z"9цsPM,ѦFJduhFKaXQrPD\/޽{I5KcV02 l I܍)!'Zpϭ?"yA2cs"1T7T(:\Ө%gC ^_{/DE0sb[Dm hJcKm .5ĎM!:Ζ20T`L!KhdPnc0j!shj0>4<"T i`qx/H2Hկ,&bU!tTo:DyKR8U _Ct*ܽOIV`z]5D,xtmhA)=Hn$[3 π^G)ɇ5Yݜ|O"v͝8Ǟ昑((E?CSƀjkW9_*X %%$ڙF/*HoI D@K$XfjȦmE3nBN# -Uk9<&Ht=k%ġ:D{|Q]Zu;a:[gr@,-MVFaL9'X@TTPtC[BZ y94 H@Q7f=ak בC *UD@5Jhջޠ8*ydWX]baN(uH#C]7GZc. 
W c" +lBGkl}ĕlӜ`AV-Q껱*PMl F]F1k)B@FWSO J+X/1j3|W4aq2M!lV"( $( "*ZJY3p@x`?k;&q b`LΡ֤N u& 4XPVDiX!\ :7XaSѝED] QQ& ڳl;!JQ` nF8u ip$[w Jq&NR!YUDI)bmgH(`+=duDrY['% p` 6Cʒ 1P #oe%{µ8b!z_4yZgF$}! _nݡbA\fΘ N(Nͧ1*LUitFI2!`FV nfDK@Kel8y/^n`|>>8;LXwAH7({ 6#hl6TqM-2QG[su5dݨ<[4zex31+3@ys7ӣ! mgid:DHSiCJHxYTAִQk3lf1j!#Z+X]SN':?TmNf}&M-,pB:ZAi"Y5]-35X [̀z122;}.dO 4%尀LJKF$;t5jzJGCX2:a$فnHێ'…0ilR \UoKcRH0r:@5[T\GR! . V3P5KW !B`E%荡 R Dž^oZrqqsvo ,mmnٙmWLw7̽L6 ndQW= Bے,[r,Fiݘ""l TPS##)bTW8xOѦɨB/PDbظM]R:7KfI1ScTtL%0'W>/c12z˲@IɨꝝQN =NY,VXN'(bIA.2UP^Yk_S7!l0I.u:$u-cNxMUўDs%%6;J A QQZ^ (@%1*`*{hT@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*]%TDd]R`ú(<3:F%Pf8b @B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P JoW KJ*Cg)3J @K+%:F%DF@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P tJ ]RiY;J @'DA (@%1*@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P JQNNT]rfSMcN%BUwr: 1e:NDwK1HuFha$ AtC$̢[SY7ٻd /3SNK^53ISCc0$O~ Oq9inOPՃiDSIфWZtcRSa:t#:KG!K!Ks"]`Xv"\*BW-ӇNWeKtu5Jiͬ &vJx/E/d z 0Jd@3}nA :wNzkH*H0iY'g$_Mqrk_JNI̅(&?fvSZJ0_YJE*zt w2_p>Iv2YJA}Z_-4LARXZ橰&KE2wLp3̈́ԺK!DteʃO(3|qKl\1 L70~O.`͉EM'q3+Ӌ0Resۍۓ7z*8hm|F3ۓM^v/~߻WBjK~4G hht9I`B}vPs[/)7InUd'O wm"œjRmE'5vT[˞L%vh]6Vde88ev?@Ya|џDT2h=kaeirODu򝟱wh䗆:ZUw0oXPfm>yf<됯_uNZDKW(F_}5p8+,hw*I6hUDi1ҕ$v)$ִ3tv'7*"JF[v>,^)OLW˞X(u?rRwAW]O98 2,%"a'YS3&Qx(ML4Uu#>bw(5G>FʴޯU ?a:>q>K;>gWg7ߎold>aN4J1jIzMy`B;ք,MC>zq|vNR1vv5ldY@|[;2br#ynH! 
g\rZd VjEa@LQj8NMpQݽBajQE#43P[vݭsH]Kx!vbm` +pSUEyF ۇ|vP)ʳuj"Gx@sxdaP2GV7yoO/wӾ_해™,eZTXE+g9/u˒_1b.^',}ܯWA]Z5rC(u.amfӜBE17&H^V?Ӱ&KV5O^CQ[#zQӲwv"cZZpRjS!349͘'L3EE 3\ EګY1дBYF8%)N2.si76xRHBS ,SdBHLj95ZkW/'i}~n5w~k."!D'ZsSLU ֹy'݆ZvO.pӱdvuY $E'/?)ߛB9;D/S#||KZ*%hXrv̋%/dTEQN =nOqSN'(bAA.|%%1 u͒$9jXVS)TORkTFYgkK#K>s" 0Vy|ӒI띩qtn)@dp>[ {-=~Sz7aQ21Vjt [)˔'y^L(4e%s0 JQ\P>pera> 9sE&螣d{sw Q{A]e=,\vUiDwPS8joi18$1_(:W?ZnVƱ™v.sA,h+E步9U4&YQpfq+^ rǥf1†>\(5smȭ:\ 3( ꕡ7ċ={qApHˁ&efa.!,{SdVڂ+ AWH:"P_5 _޲P)t: -VKCXt.T)c\"yD3 k Â*y:,WaqטcǘL{h=\s)Ī@%eLYւǼ"Рr)cq0>Tk *`Ej)))t9XGmZވ.Cv{j6(ۊ >Q8Pe\KkoC|w8M;,YڇqmJ?GUv"5Mx1 !D皚|.(Ͻyn4+F"[#L,sgop KdKU/i\XzHCr@H7+GWW=t>O9=`>j˴Uc7e| ?3~F3/tal}x/2I*TxS+,wp>hLo z{xyYwɬYĒ`&G7¢Nw}^0j׹4# F0 $J~x?GQ۬9b[(Ia|Sp_>Ut~?N}IHi름z5'|SasUY/)X磫SxeiYYtu?ߢE7.&"ӻk{xuKزXZۤ϶ZUHƥc5X"*ק])]?ޝ6wo.sXQ؃~5Lgֺ}V˛.|&aۢ֝a\ʛ9^66]jɒͣw<ۂї>rNZo-Ls^ۆV spivG-wD gXF((GZrvPOJd$ TK( ~%if2ᄤ$< ¬ǯ[i)3gda#A XEY9EȔw. y ChCD??)߱?O?tEI6d!(TV䔁D3 )l!,,(C,'uAadQ_¸ ƿ`.b\rd.}nuopøVM[7N;Ͻf'du[Q|fpa ;^Y\;Yͼ0g-ͥ3 P-\ ( {ʹ*rQC}k-K$}$i NPO$Ÿv!XLwO3 #2#"$e[k!R)JJ]m-UeU/_DFFD} ] lBÍεL Us]ոJ5,bƒbj0㓛Eczz9tßh`0/Y6*0z.QKCH d)„P6 ɹ)ZL&̊Cq!) lJc#iZfY(79̢u?v!Gڵc_Vjw !+`mI 2i^6J85Yb 2<;lsY 18K!WEG,a &F%!HE `"9Tj܎S_vzL 0O_K?ED]U"xs9/%A*hruQD2#) UED8c*Vl,hgDI[\I %`"ڻs;"?KttLjZ/.ʸh:\pq}rH~f #ٺ' @9sd!pXv_fƃamuT#F) Hy:m+}yJ[޶Ǔ,K;h֔ڰSћB*GF=Ţ'/7yXvhm %DB(VRY &sQS";d&6,+sǼId򈒴޿,m܍C&8g%rnxm͛ZA<\4/&33erziƇ%6Rعs4QٔrRh2$)A6 \Q C8u-J_}X$B}Վݲxn>f|>}ҦUo#Vrw|mgilw[P#ڜ3~O[b X/IYY"րgqm LT_FeGӖuE(UAqs JNcFe3B!dB1`LYR3V)"(=r0.I|H.xƼ(D |R;ﰚ8wSHzLM43IeZ1{ԑ>Rշ\j˛oT&Ϫ/O*jy@%xij 4 !kQ'R"Ip&Z$pàjx|yIB4 dE&Y%JR`HQ'GfCVX&fUúV2H+H8)rN:Jׯdg%Gd#H cB lgĹv_:zKQѴ BT:%t9 Lh$$ 9OMR`-8pAJ &Qxzϰvajq? M.vNfw%wKB(>1@'H8G>@?Hb! VD9=y .)Ţ 2cΓ1c˜gkE-j7j&H0XEX@Y&s,ֹTbg, cX\. 
:y892G?Q9G~st6EYɝ y2fpez_/+rxtm #F')ȱ-=5#<)0ǝz`nRȤ)?~.h,WI Kn<8MH`VNR+_rCrޒ{8ؽYTjqGc}645Z ҁ/[zd3Tx!l@瑩3!Y%G3.u(jl%: S'EQ,|+qK{CeBmqRߟjҢOԮ['ǎ=;{v0v}x3GL*"H4oǷ ě4k>5V.%Q mX# o#anJRcFaFӦEy+{ߟ.턵P?oTUnW]Vf<Z{JGcGMжQma j,Ҭ<Cꕜ#4h\1ӌ 6ܢ ]{= ÷xLs۾۷tb9zo &\+Z j-lkm \Nz$EUmj+N4ƒ&qxX₼}P{7yoދ]684:T G =+;$O!3ЁA\fL =Geh<0h'kyBT_y%'1e킴VR~`HrG>V~[rbؖr]>ӪݙV%_ë$r !1$QtoW/dR)+8Q+yTz_.^=YÇd?틟+NmqkX ~N2Ҳ =1(Sjbzsy?X w*4ܨҭ$]]d4`u,~a?~7==?bM_db4ys?ĕ\WwZ]z_{?gӵwS~kwEioZ\.=]/fĩ{{ߗC~OAQ%Dŧꋬ MO7Ԧ"ךCq[_C#hlbvh)B{~x|/\ Zޖ}7okݽّ<,<|} 6 Z' _r~my\]#5jӳF7$9Q0Y^-]c e*_ǃ<; v=xD GA4l8垛cM 0-fo~1"aT]֚nWoX f4f;3(.w|2P;u8ka"DI7.5Xy5)O5yLxRHe.izK#IZtsu#G2_^>RE _y(1D6Y(S\)D6)2s1s渏wW9Qtc xrԾ 225G I<ۇHA}-Ogv{t "SFFE5 m>lm~'vЪRmI方,#e!jw2,@y_vq@@=uT)#֥>ȝN x@%o2/|@JqKhdLqstdxI־ϵkwXrh^(uwX=qݤű>_V_].|>+ EG,o \Uy ]y2WtvQ?hry/Ո/%5b'.|I+a]r]XiVYdI|JGT쬝7Y[ft=_8:Жe!-ҹ,t1t8n;?cy6sWUhkaC^WC/Xo~ř_o,_loܞkfto_؛]L\^&SV'/z*s2zG<->kW_j6ŀNj.+ X ^6w!F׳F En#Zμ4e+}u6iJkZd'\J_$HٱU-G7iTIzޅ nsĄB/BG.h:wk) 9gY{8v+B< C6H&E\Øgc[r-{pik׶H@xw̐G(gbtZl99o8=wZevN*vqy%Nt_x?1ZP\!zQy$8F䨙6nr}0d  @0?gZ "xu|I"Nm57mKrK6i5`Ѝ4op4i[WI˂>:+%З?w{n0]R/ۏӿ[ϣ&*h(-E!Ū"KgRYkgH\>qLGQ>-htyEks]}b+Ԝ(0>73+|f#h:ͦ4 D  jHgm`x+d`Ic*墆Z>Ye O= .rlPl6V-au\i!$ytʫKQ}ʗ* PZ!{ƟP2*S璤AL1Lr_Tk]^[XIw U q0ln{Go~z}zw#G߽}{ZJ 6QI p /vMA QO&p%r-p s,"r1jꃗLJx]kYeNhYx߸}ȅIEfei2r2%A:#9尳^0.U=ma/ȠIYb@F# BG I1BڅN؎vV.ʫ՟;IY`2L I9(yR%+w M׊8ZГxgYdt,p> lWM$/Fv M+̄e{g/H?'ɗ3jC0daw`Pz+dMfE)VYv NM'x&P-yߴ08lgVjKy +۝}: ?}C1eSU7&2#V܁#sB#T 0W\r{]77:k:\Y&&uri i҃]R5)Z)TYH6 I'" ,-*\]ф^G\Oif OEz_ӰwIϕ-t'v{o$ײ{ӕٶ6< [|k)MqK+V:U 1j;{fH6?EulAܯwS w!{4ɴ-vwkz$.we)L[ NqE\z9*W>\9'E]d;= /O WoPyZiV ùBYX#/{ bfFA Uȍr:u ) !]֥Ȳ%ڌri2p4&d\+D:IBWK'Fn\u/d8 _ve_!S1_I$=t[хcb&YTލ27_w~ro3TU+Aep yfQ۸S?w禪ɓ&osʃF*- +I'm2bXY' T|NQ`+za fhcMN\%1Uus?Գ||1]涔ٽ寝xW6EmQaW,W,VmQBX 5 K /I K Mz&7,)XkX 6,iM6[# |k+l, wWJsc!lqUkE\lkW5v'Aq5Z[$ ¨F\tqU|c`ip'AqefLl*qUЕz[UVo1X܉oR\Ym[$a`kUAw{zltqU;c#]]=aQzJqXrY>B\ٝZwA0e}'/:-MH,nA͂cJO3~ty4l̿}T \!2866g8%1]U-b`O^݉oHL[W?x4);wBtFdt㼜q@fWzdЂJIm*VFB Gć7_c}zwOmSGx= G?aԬ1M?ߑ!#$Zpb=t'wyHyMQ%,R'L$Kw'gBBimux4n(W{;i- 
PWZ0h_9LU*9zWp61Z3\kkIL3xD LVsp$$M8 ˀ)m)d F:BDkuL*а>CYQd?D83t~MwYe`H cYkqNRS`k܏u^:,iw94]Јshy1K+Y>2Y*)/:kC Q|E/0>)kY,Ҧ"W>ǬP.<ڊ >?.bnԻu_IyyzV*Ӣ weVZj)0!&S{_n2tAǑU/P998F˕1[ i;7J5COQ*z]jI6lӞ_+Km)V);L.BtW&rcŽ};0`|B@ < ы#1"GLAQ9{핆JZۚ]GS-Zr#Ci(%tꬂ>f <%ϓ@Oq&H޵VrK h.֯|P0A4u٫ڊGMn Mח[)M/V:yge&ʴ)jP L5bRrk*sm|BNƤjggCB*RڐH6@X$`4&H!)! hTWˏ5Ag}&ʦԷnk3CyJ.%ٵYhDsINbbޑzvR[CIB,1rRF I`ҩ%1m5h Y,hxz=lpyc+T:dԶj3TOS)%ϣP^}5NQ}ʗ* PZ!{ƟPZ]_K\sI&9ՋӯhFgT0yEﷰ "J`11$"mۏG?>{C.)&m(OA_>< 4eL69[] 7z7Z~X 39Ү/X-m9jNhYx ߸}ȅIEf2e ]e>JT6j-X;饝 SR&:^ ˃5tdPʤ,1Z k# CU!xHD" ɥ3Nt-X?Cg4g`2B_[=9ٷ)';t41BڅNjP9Ӱ$bbFG&.FDd2gp ɶ6p!%p¥N.g&Mw<!8+dgZ{8_!iXj&16 ._e"yIʶC$RԔ(y[fz=38US}eK sd!] :b=c<'h1sCJ{) Rm:_ ]Ia*K`Qd)j"h4rV/)>eoPIT&$B*=#c)#9H`(KB}#eUeZC!xS$l2;_59Iǝ %+=%(1g4eFCh_hn5^q!7(@g+B@ޘl*x/SvWV Nuh5&W MtSer仃! UAM!?L\. *Abp.xHH?9%HʠtnNj~@4nF̱x gB2td+c>`5ح lu:z tB27tݐn1 H˳sr /A 6RFyg#%lscB)! #/[sdmpcCYWz _{h_k#C¥*Gz|>[ WƇ+zaoFD!6} cځ#O0*k0%:Osk-5nנ5G.;i=Ғ=nm4aE X*vm,5}mk=0i>ˡ?[ںg;Ԉe:gӀU 2],/kr!R+K%e_0 A:l4L@t +{fVHxІ夌܂U_썃߭Szo1臫#)x,k_C÷}ޯXO'wY`-uYJG(@NLE Sa<E؋E(|M3 a?E,h,!Ac$t:۲?9/h`]BibJfOmxK 4)F{Az \QT6 զsڊd9l*<ҏSBH|tW4biom_㢆\cS*'OXCcRBȧX 1J^E`2Аygagxd>ug,ϓ<[L"$0@h^Z%pyN#*2Y B:S % d8rh쉛qp*Υ]CfA Øƃ"F+43_ooθsڬ:cm~^JB׊j}JR0_jMp覿KD7hյSyOџnv@^ +%?~HmrzUs9M@W}vׂ^?q"$tu/ msway!$toa&(+YQ;-GO?m?<틟skD펷kX˵X>xYwE,^Ѽ]P/Oﯾ_}4Th(iR1F?|'~Y?\Q9ɻ9v_y:rix~MCt'ylzllQ!d<;֮MxJϧ3Խ77rod<(B<$RK?8;^o˯<=\ks]noZ6?~IEdsLFM ,io{B]^,놾nZ.iFdZJE×y_^t[ayIynFw4(BŢz#K3}cST.Ǎ|),3Zga5+] xWG{R.ܠŨ{U~hL.Xt.&0mOQ?ED S|M⬥$jMmB;wu/8L<@jp"+ AB,H:Yn䅛͋,<˹:i#WT\‚o6k0f(AEF+YDk/3. 
:FH`wAu&$씕 ;BU:`"Sy @%2Oh#6Brk,AC9rkq_º쌺 Fe/ o gҧtew^ʳd01W^ɜr(P*GDi=Bfu((]pG6q2kixmW|rHk*>|JavQxJm0wRr䨘FN2+%eY%)o]C;t^\M>uK<>7ߴ1xn]oɸ7 /#{q5Rvͻh M kAEJ@u!#,X%= bAR5I2)Ń%U"a" chX CA|Yv>P w}|QiCa=^ܶulwAzI]],Mjޡ㰔{iTb (,@@N:'"QqDDk%j.sRA W tI+nΡ$=BZA2Gg%Zs!Q:CCPqNN(Xb"K5I6Ӂd DZʣӦxwMaxqy^| :<=E~Ѱ31W"LFKJqFҟh3a~vg \X0q a,m9f Yi2xm8Ȏy,O|K 6]6m .,n/Փo"DOgD1F L29ZV^hr"$B];'S{M[~7HOOFMXɽO/t6]~ĝmvm^23$ͬ|KK|OTkYC={]ІPKѾ4t1twm;io<_&/hyҁkL:oj~نɏ{~iW,`{{ tt2΍1}A_I+e=zW ]q4(=v0}$<-=$_VE5f.w\69!k(Xl͎'c,'ˉ>r"v@DR9Qgo'S["[ŖNG y 8JcFE)r|!I,֭pb6hJS8sLƗ<YBsBXKX>t- =xۗ57/}[KD!4iqCKv oXWdˮO$n'*Nqu5ז//~=u571:*u&,0҃;^j 7w[Ms<1 ('"-s2I2JoHPrzE,-' B} &'A X&P '+c䣣@ p4 cB;YYϪMz՗'fK"ZR݃.DtJJ2:dq)'odA"0g33r;#M"LJ -gPF3af6W~s/QK&n_W/!FkC- H 0Z.H-(r˂6jc\h,Nxe{d'䌾Q&/"hyw%)[T]Iʖ)@W=O(\PBM vudRv0tS| ruҺ@P0&0 4o72Z$>'kSvԚ85N|<ٚ=x&))hba2fWMKԖQ$#9 M02Kr,ҟLѿ1 Qj -Ĝ $dUi'AJM9<$8vLWk:7$ AޕFn2o1 'd""vjʁ{9(FsvE6OwW=]5="%MJeM0`f]ڤR6d #,7 /I(\(B]YKT8d(3oL_|@&˖0#vʽ .'U)vڥc똠D(WIoDBtD$gs`DZ(H4i\Xtlfkyګook,ˁ/(ܾ*<ڏ ,5D H  ̢#!۰YjmK4O|D牑DUr>$rjͮ&%33xCmJ }TR%&Vgs.{g+DHcp =`o}ZYO:H+ޒ,rL]U`M^D9ejk~kqջcL"䒐a<1~yݲ A[`zL0Iyy?ga=AQUY*{S"FpQǫ$ mƼ`*C?Oz V}Pߎ]~ƌσUTq[7aH S9smej({ci MƒUyܢ:y>D%IW+MpUw`ܙ=OI184&,:Sݳ $vrŝ.\6v׎ocvK܍X5(>jN;-BCJwz?QMkޑ*͎]EF^6u`5٥&||[:ԡ[pX6!Ƕ;6.t~,]Tsm!2|ϖ֫GT Mĩk~`i#毋uP[9\} HPCӯJL'^is2>ݨ (Xb4_u pQO BOPkpfU"IWϰ )< %%T@KWJMPZ$gmRDO p/=6/6!lvoj1- \RsVSUe.@)KS ID7# -#3ld̆ÿ%X! 
JeZVp - fiqݩ O,e͎_TFYy>a1Ip jKa4wQ8et" Kٰ X}{bI)I#uֳR49 FR)\js lƸ18 >?yυOʅw,hlS$Xj3yrj(j4Ŧs@#yT%kTo$@ /:kb"c͛p ,11Ic4g?N= cBblOߔ|ʈaF=#+,5JP-4"-V&-cdA2C)t"yRh |$6+MyԂe0HI%E5O&%NSMn ~F*P{zR:SyQ5̋Ş<#RP~s2[|4#j>g*<9>lL::·O@a ngjCݏ/Tع_ga8Z޴hFHw !+WE ol~#)x6BcO$,oR@dS*d€[K ٘,'$OLP9mv#$ ML E•>`Z.ʔ!>3(t$b9H)舧S)R e' a%n ~%X]8Z x2vl<ܵ?n=g7;oyUUaR:-B1Ӎ: v/ f0Et`WKJAʟoΪXٷpon2CW dv8<9o',=!ؿ~8V=-o'yJMjiDdOR9 濥3vvYg_6W\ʆ h#-3Jv{i]_qwjQɛnbm L%7#lu9T>\T_+ /XZCgx3 3-`?܆$84VAmP^c|yC/5lop9)}0w'7SSx8CGU]8`2N,gh|2JWdjݵ0 x?gj7԰8t0/vi,%*VF%f09b1%kdl\E^<|#߲bAht Vdkj*xs5o\4x_dX!8R\زL &܌I"Q -]wmu2-'Oc,u}dQЛ#3 쒛sL'7iXX+Ftmgi@iY4Ct ]!\BW7*_#]j;DWCuhi;]Jի+I1Ct+k:ARAH%0FRT ;DWX2B3t(kvOW4Ԙft =m+D{g5ҕa.YWXwDFwm3(yˢtv U{}Ãw0}h* Bol&x9}uݝ6Yk=?->z%{XƍUw+Juyߦ0ˋ $9y婲F*iVV,#!f5xu;kMY[ѳYDn.&aZ܈sR?ϯϣR뵐Yae傩68s%gx&q.{L/ ǥVd8v5y ?nrB44YNwη/-]`l/b^]wnq:t6Yec (Vӯu;]X*._.)U5#܉ a;Ms/~c?ƣO1,gMg6z8w~3p?ީ`:;@D !`A8F`:y&I`h>e-φ5R%KJ.qL28ll399)`dP܁REְ7gkg}zJb/0^dxZZM ح²Š\#>2O#渚[pgWSmMyŽWo )K֔/ Qq\8piw{S܏gݲ'8{ JSLIfȚn$`1>XTP5)H# ,Qk$80) 7T{*Eu5lz5g8y^~#nwm͍=gKRu*yKlR AuV#,VlY^*e[oW&blNxܞ~{ȁ : Q;=v7og:S[?{":O!:BgnŗŧŸݪ/T߶j6m2/m]=oiTPJ663Y\P #R>"p@ &F:9L㹗돳mZs&jQ`:[Vw?/V,s|*e[eY\ȸd~5[2u]\<|\{ n{_?/ۿ͚^ܩq 1qi_HhC:ˁF Bm =OI{-\[;lLJǻw*_w{ٶ+^ i?I E~\*ᠡY!USR9$VD3M2r vZClI-`O`Z^*8hy5:!@{ຆMªZ|1VW:Dʐ r)yyejp`7qC<;Ij;(CCK~ E2eR@&y~>&Jx/[2'c&>k]* Zdz!'YaU, I-1W~)Fv*ጹ&\4;~_gZF<ݻN aPϣ4v:J6qSޤgQI'y0|6xW=ikjq8_|Oan!=tU{rY%{ ߳/%o_:K޾XI`Ilm˸EBi t"SJ ||+15`v/&p5EP2;j9*5P)1RQk/9> ]Eʽ;!v0l>&әV,S֗Y-;3Kxv}\;ۙ2[FӮ~m1};txu4.;w[&:9zHZ2ɲSfW-" .{ktMr7g}($VWѰAU!'RX{W:٬z)ƃ}œdaQ}+]QG-飒5:j*Ep.(vM#lށ}ᗰcL2-wMV#"d#wXAGo-H)u;}M{[G}~e |9ڤuQ U3RH KPHyZݠVۉ'.~{0l1;paɨb*:!BtZyI%#jrB`r-ʗi?XOWOt;k|xP+I|gmXwLp|OQX|P A {|&:,2W(!HgOS,cDDdt]xWJ LՇR* ɉ~yM))7%%QJJRRvԾrQ nk] S=jBJmnΩjd~7k fUtT@g5K5ޜ8q&j9[cB萳%hW`vL^ 1mƛ(zpUeS\ &"@A/w M 1c@!cc)g}Q1t9{2YXՠȈ.d-QM-r@c8VaTs)6jtZ,:ؔPyᮚdB{o46X%2JSоUY VA_jԅIdO)ڃt^RlWb |^6f~rCЁc*%!_.S(HpfQ@ra3[%׹"mk?u١@.b{AS8u^4k>~CWᾠ7FP8 u*3O]Q=i!ix.e%e Jdj`ˮ >,_O}#"Ȥ٪ 
)@,I,R(RW}S&M%wtȶ{'.&F֤*@]@OPTTϚg͢~wG.4Zq>ƶc{foů#M7pI7놹m?:q7knݦ~$1f)F!r7/o:'BMΏ p@O7>fz OvWĿi~$[VMsgrWÉPqG\l1ky4r߫^oU?b;ّuuS>6rTǧwYGNe;/&X!Z鎄yFkOf#-fkclAFF˹Xn WsPF2rwy~o1p\ost,yqDžx&^/t[ݕy᭳vSo&Ǽdϗ-"mii>.+V|up A!b2:0)QO(.r1gkT`kVŢK6bs+FP,J*JO;\9mfN!ov|JqrTZZBWQzWhKPW`I +գQ6U՘o\HgƚDd9yBl9q۸9TuӓL哷66j+cˎdV4Tds &0b fO䂯 +FXK߶ mz暒 d\ hq=%f*jEu3`8ޞ8GT t*|a78v7f <͹Ԑ6GW_Wf}ϗ_{ت:L) &[&Cmd!&RjEc]glߚ1LE keRF7i"C:TUc,]ͥC{sc傷(x^qkO%aD`ňkæXhEP"ǔ1Q9:y?y)Xh@A¢)^| [GQ%ux륛8H74x6~mgG<>w"%EcdmYg7n(@K־(t1~bu {Ե$9`qŁ7%,LZ ,!uʽ+s#~,r&sbݴT:E7/bhʅ')Rj6G[bB5dmSʓ_|x.vq .czx %Mُa>عKwG?j!ј\l)n>AB",NS3%=yœ{ޤG]:y&V_f7#_.~fym ߗ7wj؅w^6-]I LToҁM6V U#Ū!ٶsk}W֠;vMȨ=ڊE1ףA*8B\V#YC7qLN_E:Onc>K_>^lYcAχ۽RG8B~8BIŮr`c+*HVs 吂cBUv%`y<³x;~@:>XUאV |\Qc&]:/d`_苈Uk#Dh+i g+!P֝C7q;E4y=BvK?#m2p7bsR;gy#޹NqqQTΔY%GdךgQ`B#FX',o.yr=;]yޝUr,H+c4o|FaU]io9+ l}0 ik"KZ}%[[DRd,ŧ0;y,rjA,9I IV$OR׍%?kډE%ǓՂbȭAoKl9IS`5J >rAXbfVB/'jdή8$e.mu6gْԆO~hq2 u8:drr+¬Qk5 aD D0HinXh'/ Mkmf.?K!! }s ym֥2o~Q 2:a@[^˦Ym6Mڣ~}C_C(Q7&ПS4TA*0)o3EȺ2ħCozl&N\$R&tN xp ~,>[X#+*aUwWՏ- JXMJV%iuRO+*!X  tR$$;e~PRvQ=%;z _E aܳ))vo:i.t&ΒKl $ ,IKz:NS7/'HVN3*|9GeAM_}?nЛ>vW_+>}\4Oos S'rr9ҟiz)goqљuxqb8zkEsyLs|g~N=DKBI/ -Akܬ܇7.#,oQ.*?-`0|9^VL s_Kwθc P"7LBmd֜XA+៯pbR@(wj/c/r{#{kS_F.\5\ FW#Xnwo^>Zfm>J,*Sy_^44*< xg|9MF ;z&<roJݚ畛I7V Z'gˣ&o?\9BQꦠuΝǒyu\ &r͍*ViADbĆ#1;ئV$27keEb$6{BN` ܱo"J ; ZDl 209A ]kUN'r806L{}owA9R޶cOvmƦ Lic)1;k9JڌMpPbeA [JbsHS٩Q[QQoG[g)7⩔]ChygO_h}.#<־u΢l:}\Zi=2aDD&p ,jnDh:vF6x$yyiۅoShw�+[7aOskn{ccΘePKv-7`6-1[fb{kѲlkz<Ĵ]Q7X*Yⵌin}D+G<$`IRa̓[7zfj=HXG"D I8vKRʃp-ay㘢NP9+ iErLoNt xbp9ra9ܾ;/*M$"!*)?R`a+\¢3)r%+J5GCl(J,$1#\9PFtDVP`sC3U|rDu7DwD@4:&d̰4$fD00<[A}cPY N {Y)t9l0v- iԫ(sqEa< F? 
W}4t3b ;0͹gN߅&,5HΈAHZH-Ou *i݋1–J6\V벶JZ$zhfH<TtB, Au_Xv`wD Rzzy߿.jN~~J*'$0ȩ7h`U`v\]DIտoP18cn&@D:zU|x})& e,UH '@B9U붪b+T] W6ޟիz݅/E"'ibs-ʍJxo *j޾>%e#͆⿸$PH8+DFpY1f!'<@E.y9{-#pAɃTƢ` jH"A=Vt 5X Cׄ!x:w@څ4Lnst *ʾ~6ҊRPOlmR »"ˆOL[lw?(FzL\qjUv5)\7۝Hӄ'%d~In:3$*e~ f~3JW'Ar9z2imWJvWdš'SA壋,[\ݏ\%W\݋Z㈫QL"Wd'zL ScpƂS.NyEyɎfTTF|͠RN;0#/Y6G=&X~et800J5&ӷo7OVOHL'NjT4PKv1z'b H J(RЌĖp$x/^N*hsL}J <+)Lbre  D #yoS~@4 ؋'F?껷0~Se{ =f~![ 44cwQQx㠄&c f :£"0^RBZ؅.?ea)z?po]pMM>r+)hz9E+}B@0bcd9s'D-cB6ly_uYeU[W{X@ 5<:8I䙐,=<(K "ƍ ,F<UX-zg՘rU)chnk[E6ElI\"I$Sk٨b RkD(Eh4[h"w,$H+% Z$x^,`D ZdT4jCvY$PN"-VJt="Fe\2!nߔgLWrx5>Yma`]Kh6|tV0%%zP4JV0x`FN7RΔ0M:d,2 fK}xHqb'8cM*Ffd j0DRmKȹ[2g+ i˲?{ƍ~UW{]r[y|ٜ+@))J4dIŖ=Ưt EٖMb57%^Հ&x|eB9K$$y:P"*s)RhU5t2+Zb &CR)DQhGɶcg|*b)i]e];̦ebFks`>8X[c`$BLx=\sOج1HYbsv J* rEVtɒĈ$\4S$ `"Tև9[Tc*76}-8W#Q qЈyE˩T*hruQD2#) *3f"8i%rV\ T2p+,iA,WQXFW5rNuUҋ>:qɹzT֋fЋ^<ڢ@41, HI}io5jB 2gCžjM7elȣ'q HUEBܐ(5_|bj'D?J'^إWX呬ibsi..QL5YhЀ4h+1a|@Hd!Z A"᮪fud#(" .KY9˘ݎӷ2ZӐPfu5rN(engqR}:.=C+p10Ɂf J2kRDK]0E;rY2EuAvdОaRYzkJ )u2H4z3] 9',~ AI鏤hHRx%EtJJـ,KS>fMVcVUlҗoq(}I~ۇPi4a՚cf$ u!r!rnSZʹL9l^$[^~Qy\,n_2&rS\8KA'f!1 5+c%p p@=qEɒ僌HƮR,}E~[־dcV98 N/ 7 b?Tp9!4To<7! Ce}qhIK"N!Y`* `)B١Fy6$rBi$ 3|%, )!Zc^ڨZri l",/|)sǬO !vjML+F*ף2)|LN9|[o%jdqT9}H_H^/ooRk6RG(hyMcdc;=KBt~cmJ`&[t9n̈́7nM` !|yn}٣_[һT>W N0S"ks(Jt@ ҃ $thn9J!ܞrvˋrr ˄{݂]/|"/2m$? 
HoVW8oU։S+|KA3`nP̚SY:\,f5őI7c'\vBuSӧ=|;]$ 3o<WفWbn?&?2j*E=ӆ5׏z{kOպele]NuRg G9EZӶ~)OLiDZդg-nvs'QgH)C榔+Q(58v9*xڈ]mM `1z?^^бëѯ;is+y!;Y +>yhP"Oi 2 AzVJ HBf¡YʕR{t)x֭,š-O<ԣsc2x,OknOO| )h2:[@ [:Ate.fd,mr |h0r 7Q :2bNH[uF2Yy6L,-'|AB4lQ4 @hZ^!xq™Mnl#ST/xj˾ⱆ`LPb,GL+4%q; h8rgȠ!ioσ⩣xfW T^3hgy<,$$2@h^,$CqIhYe-Yd$~L ߿w7GO^ FrN䈷-JG)gZ9kd 2=Zʞ}+w= #@Qmn>Jx/ ]~xW5EEB% .G)D Z L)-]G+C?{q,yCbA ØNʹʆ}^zoJHȫ(5@}V{o"Km LOK7^_ @Z mv6ۺߌ.~ghAIlMo Y-!<׋4/r6u7[qBVyK૏nN!uJL)gSk)aUcUݰ=!Ve;1VY|U˟#^ATB:c *uz +Tףk#]aOmՂ{I>m$i_ۿWem+C[lN&NҦ,-j3;?zoF7cIsBm̍ChT<;V'w{߮fYM6/wV|.abYq%67/ߎa{~;;g立ϴVk=ɿu\z7߿ 9}w9_m\ޔ^/ S~z;?sO ӯoA,.Ǔ̓l$-^a\[??(lu5IDjv=hM@=?|ؐE8~ǤZޔߏ>7xf}rjxå2y({kT,]<0 Ѹ\{$!k_uR>0I]:v|B^f2 o7ct(͉)>w:HLBD s9ҭz+7/o< Ag@adm2ER Jig2u%[/y " pY૎ |ѳXb}*xU8 OC4$#HBQx.9U +咈&@fn1fQAQzc~r-8٤Y~F˂ ׏g(\s^Gq(Q=ǹUߌ˫ ck#"SFFE5 87?Pzқ=TEgnK2iY.p'+/<P#uϢgUʈu! lcKZTp!/g It,:)9Rx] 5Qc }GO P3D|GG+ |Er084ZZ|uXn.c7T5|93C!(krx$C;O`&a2o-I*yl$F:.BbA(2aG:!3X?׉ \>bK~<*D{}]ubtt#r:)>?;t9GGnͰ: U!ZjaK oӢff,nfvۤ۫S?)@^}`fSg@Vyq'ITc >}L KE6G:~V'fZ?t>Q:>PŁsyjIꤞh0Ȉr׏w?U3n} T<{{(g;={52>כ[$[_#< MMs K4 ^|v]ݿjWSvP> Hf!:6^ɼ4%#ͦ#)0LRW Rx<?BQ|#=wal\J{bazѡQiH$ I1Fz( 8#z$1D< N#{:;u9^^uvvv}N{`UfvѢ %HH%l t鬂D#\ <]IYQC"jD* %5з&"xys!F*l KX6gw"<pL$6hE;b3.T8Wr|@q͸*pZSo:/ڠ)N&.Aݫ +Hc~OQyg#71RZ~kI)Wj"u \ʖ_h mfǖ3;h!x$'HT?C"!̎`p$ɔV9(j4}3Ы1q^h$d|sGS$W9;t>q݀]~7Libڤh&zfUe^(j8IBE:B\ Xۈ.hTOGOG {๲P\=oPA٘1]űgS%.֗:$”R7A KH!)pLL09uM]݉J%~9fdmsJJMd tJ Rpc7;oKdc{p(\Be],2K?6T|b% !Vqb8ȕn.z{}.8ݛe43iG.V]'xMf3b[?3t\ f_o$>?OLL* Ѱb%Vm ꡭzcP/ty@v="WWPWMVDYPBJ$F (b`Ew}]TS-[=^"a߰\j02:3{p)m!Ֆ :<qɂ7dZl>h+IŸNzj@ s ϖ]L ˢQ%/D_!҈-dJSwf-O"3+ӡ=yVhj `GvsfghKnK/-(G} q/Peiu,b-E C#sroU}1W(-P*KZs͕RHG* l\}1WYZ*Ktk^R0O|Y`\}1WYZFv\e)y^ҊO`X1WY\7+usZsr̕Q9ؾG* ]eq5싹Biyt-~_C7WOؘ- Q5WOvۮӤ5g?\\)OәDN.b>f4QEt&qJڨ{3XɠF?&/?>*ahѼ—3() " 9fS͹4RL?zQ/#3botW,nQJNhk_f z;ϯU<ŵě9#Nċ9-o0_-}9ST ?sEʋݯHσU?Ҧlǁ rGZ. hZ. O<HB1rhZ. h^̈́>qCo e61QƐHPƗ@+!ԕ, ,8,;KnbΒ*;K*;;ͩw%f˾P*66E)eEFĢ7 '& 0 [)V%)/R4 mv:rB cE0*'U{MlLZ;Ш\ //(ה2t}Gqh0ˢj4_o=ՆUK9yQ|2p*0wR,. 
^"%}YI%6%~~wQ qE2*bx@bVDٻ޶r$WmX\E_Lf1*&ڶ%$SHGD!Qd{CA̫:٬z-q:2Mg$UZ D,^:jG%stTYp.Q)l0qޮ~l]EQaK%BL;AɱP{</^>0Ĕ|6*DdB%,J9 `Rey॰8漸B"w*}uR['Rs~Cm]Axi:s)(:A+. f봳HTC15'UAΈte.gܵ3AS1s*D9Y99ֵزhHK#F5h2#7hÙ5`c( yuց1 z9okIgw:&]ߤ[5}淖c?*|or2HғuA±T?-aݪ UB$sMD*j !KR+oxܒp@%)X~X5u8T0TkOsïl#TE٠8)I|^7$Gc(XT%eYM&& ٍSm^Lʿ0FR\ڦZ$1Y[&Jk=P6s+A9Y0"^bB ˎD3":Jݶ-p,jPɈ.e.QM-J@c(VS5>0fKb:X-2PEv*6e[*/tk SV s>F Ab$VA_SQ3%|4Òv+F׸+31(2WX| Q,ht>NS m``/.6v@662}ϳCE~߷=NmB[-%tL~t`'DcOƴS1i҇NcڤMWw1}>4QP, zq~y&o.tY_Nzc ]P; \5##㬿}/_ @Y'K+@6L)@'>32bxN-,UrLdrĞْ( S4)ڞ'ܶJ6.fFjr-Cu%bb\!(1~3=ߕz5xj^^] UEX&6Uge=Avb+"9mSO"OpOˋ$ w!ETsAm9Hcl;Eb)2wV4>Mi-/RV)+y)bvճBnHMƭ+^﾿Gza|ͯΠmTqY{v}7uaܾBnm>{EjY]^ bn+%6sSINZZwۻ9 ;[fߡev{?W=zr>̦-ݞ:}t^~莎]죫ޒ%V寛[im5Y޿ӿOe} f;uyY0*9Ӗh >Y?A,3lp CXCb>PY˟蠣 gPxJHΧKZ[t9W"t9HJdř>&>?=d8<|~ۧK!\CVq6Vj-p  bM!CmˢMKBĨ|*Fg^Um F~~#N@ab$^kB! e`CL=Kyq6'jW0P҆å>?xʻnKAS 24TJٖ}"L.b9 2Ӻ'9`L-K(z1kJLVLIS'5gL=6L UiC}t};9ˆ7p-3=|_6|{lU\FSqN&eYHv9=œWv-&rZH+6M|9ב|NX5Fj᪊C38=vNVy0^{(xƣ>"د#[NF\2l7@ 6sD-*R#Ԝa\Q}Fփg!,]̦ఫ_ǧ2J7lO>S ^_/sڛ NR&bboLNXTQZ}LFReQ1:T q~3mLm%kvԂ>hE|CLXCDqF͋EpC`glnwTx}=.4[ @y׿j, $ҖjGZX |(eK?7Q;S4Fk KHMÑF@ YbPJ+:Hbl&rf@r(,#(qhk/LN7m_b^δֶ hNOO7&2dh,m]<_y7i|$z?T`/gOLϼQ  ZJFiP`>P'o3L9I9糫B\ ngp^<0kF`}Q>B0x-.]}7}ntM!,4(OWԻEk_^8]_~NmEH\Ʌ8Q/l)ĻbF+_\Ѫhټ>E;?opxzm`v0Ѹ8OƓz6~nygM#tes⓷/EoOJm)#Mà/\j8VG]_Q7]7p=꼥ym!m`+鬴/.s.ՉǓۣ2x'Fqo~|~l_7Ûͫ~y\\d:$rE/:5chC^1UC^qѸSB~ ~d3:^pn|5>%V5s|TY&K]2*@WTK @, U1FɆszNVҲ)+Vb$&B])DY" ȩ4(: 3V |=/Y>漹#9ªcη)i48e;Ut8ێ)K] .:{rD}><B9A< ,Zp)U`C)Bl ̱ZLb*|y\{cdLyG{()KZ*Q0R2B*JxÒS ˦$IfEĠs9Tjl D )04`|K X8yYlcq(1q$&qUkmg_]sƒWPiw }ʵ8y\/q\JVSbL AVT߷/.P$eH`4kqTh,`j: m_L?B=ctNI>)~$8  Oa0LGkIqcq.&ov>?!y3S*zFBN =3H|4Nud9ED2oF#\wV1a (|Tbţt%EAiSĐ@,F*1 (,TvYv YA9S~*ɻb߭Pbmjټ~q?0A 9ջN3kԎE :60h`-iF F3ߪDsѻ.B~L-߲x@g4Q5̵M~k3qrx>_}׊bR>0ucww&Eޅa/u%/iλ?&'rGe6%N_ӷ(}DOaTՄ&dc(spM'lRҺȧʫ%B2< k4jD {nF`s:ܙY8>y+-,KE_lR?\-r!G(9`"W'G\ q9WjU&9Y!<?w/'Q&ÊiΏy͗`ՙC YǼ?~~!#g~v7g{'_?)oyq5iB_ǟ ^Vےg'[ROY JR P\;]aqCB^ A-@n`4NT},}\BN @t]WxD4 aǢ`RQXj}!InkcE6f@7xvؔ'^ $W@I Č@)C#E֎V5Rt4ߩdFr҆9Mf .y]\kRzie|z_}Hڅ^ ;fg^/Tgʧ8HE¡`t85ٸ0 
?I(q%bOXo^8iɆ{ÄqRVّ+60N e^DA2Rg唽 ΔtUIpЩ$k8P.@ `JM疍к" ]UZ'pnkT&Q6dC/JU7g.N& 59 ieb/*ہd/_*OŨJA޺ zcv9"FOݔ¾}M9bq"Ue~;ُ[̥`"gNDt0Hrm@OPca %{'H$q\+(ôN[ޱErdp䎅}hr"`1wa bha<)% 3Bs46l7&]z'\ˡ˽olovS3 ZGK!zƌ{jd*FDz4‚S6R RNzzNz7 Y_G :3Jغfm 11k~Q>(l"'&2'cXNp96Ҙ njG-P8qsdNSf]PhI!gXblrL F6#J uR1G EέRȰIqGkZ ش;1qnw4sp~Jkk۸3bj:^x$V19E(Y KSʔ7\6-Ub,"T1XaЊ IqL "KцZVܽ|kG4PvGZ w>7:]HGS,N9A(B!ʭSU$HF N uJhM"`1VYET6ʌ ;NRUJnLGR h0ގz\wܦ(-5vEkYnɲT1u%$ԕbיJL] STմ/^2͞tbrEAEͧ4 l+|לa~); c^iKr1h؋y/k\fzɿxM^=U#= q/7bwax Sj)\U)uv2Y;·;53Vͩt~xq8$Qs-9˝R 3jd$jF~H/VP~\+߉hҜw!.zTs-ܬf^8ӖV0#)6gF\DcAI`?&vW}1AZIJZ4G֘JJ/JhuuRbDZu +isIj/ Ub0@JMXAuftЕFq1J)l3#=u!}W%.Ez~nt]%%۵DnꊵMbm)ujE]Up-v]]V]}Rf.bx~o'$[ r^O;˹`2иznQĂrd߽[kmnFe_vs#`WɮS8(DSbW8~áDR"[ `@tp kT86je@_Yt:HDb襰K%Z+ZZ`{_^,uN; 8* 'OƦf8ø؁>3:hWnepӚqQ7ɀw9BM0Ųyiha|\w#wd1&$σ7ʕ!Wo-as?Y* Ö4vcd+: =<(X9)jCy9x :95N6 (WKMHLB1jv |QRk%àg foL (nbt{6$KINxqn;ތnˉ OnބF;Dbg4&xE2" G`,X:4 *ƨrRggTDT aU8Rn+帓U1q =,eaqeH*'5[-N0 C i`"w0/T&u.U}(8Dzq*)Nd[2Ȅ#c8FotJ0(>ee6y-k9L4`0?^Aks0$d)ZbuX`?*3U$,?@2gHA-ϴ$AG=ba.dV7r,s_LbsT%( V0bҲ϶z7[oDBP'\@hpKK!NVxVV1?vY4a&~H^ tK*0iLAͪAL8Vo',9,]6r=V!-y}~9KÃHt>t-:z&t$4Z$kmJf;2ڀEFUztO>g)$@k}>*gͰ }6 on.8%uٶH뙐GZ6k՞xh v7uk_ɖ>>h~1-0T#+8؈0nLE$\cOτ6XLMt)*䖑suz8/PxMs K}mѧYlF_'O4)^y#a\['2Sig`=BNp-6LZr5[g0 XETEr+- . 
[Binary content omitted: gzip-compressed data for `var/home/core/zuul-output/logs/kubelet.log.gz` from the enclosing tar archive. The compressed bytes are not human-readable text; decompress the original archive (e.g. `tar -xf archive.tar && gunzip kubelet.log.gz`) to read the log.]
%v6$&a DR_ wZ0=Uz>uhi\BÁo%L۝%LdzmmNq`Uj$wCDlւRIlSI»^IOt4ׁX)Q.@v<؎9?3 @P'}^Ţ(:=LXwyl^Lzi~j-c"9#,2A]I@ V/8xK1mxRԬ!0.8NϺSvWS Ӝn BC{Gco7:ɰX^O+¼H1.Mt}iX$.߰8l|.? 6a5ǯ$*J 4$PfKɒ~y6"o}wwPyBH 9 J2ΉLr$1Ue ? 7%o ~_ҧ &M]{1 9ҪߴeQFߔuѝ8 zjhk,N?qz>('FL'U)yvR]cCݱO !з  JDXۤU:4}DPFm*i[>/?s!)8Ђ^ {4!\o]p))nN-u:r} nPW<eռfxb[ϲP 1ghǁ+p#ɀ$I'CMk4W 'd|ګ,/w" VooWB U<"n [wf^*TrW µqKYHyiA8yys1#YOTSĢ?RkO@׬:U7]߲ qjg1꾪}8 R3V L$2swn+\u"J9EBz9:s!ED?+HemF:U,]W;kV%ӏ~7puc@?ܟyi!Xrf*-`i,Rvd;Ψ7LJ7ʗ8GI'ޖ)UQ7oD4)F/Qz~)L sOZn6B;/QL'JjZ<+Uzߥ?z綔p/h%Dovȳ,\ Bhg}o<6I?G?3:z3zo \8L 6~,?nWB)-#5Ý/+н"4eUԭ͂ Ѣd!wM-ރY̽`9DV R=%G2#v`u0ch7Y:(Z ]_5Mo9>2/,;Yn^:龜f*>p0¼o\\,2X`[N5@11f*(Bqa-i^-@$eib=ndT) fpJ ((Odqy@iҡ+sw,4N::%r^SM/9`Lgf3jir%2hb\hZFhLYEܽ N?4"-5}OGU4P<&Ze(EH ךk\5T7 SޏոXlߴYd&Ru$_G 5t.jzE%撝2AnX@!rV\҄O_m40%5hl(NEΜ\AV }g. )AzL{=s3E5 ҋYV\HfH(MzܨYh[lVO}]m=^|" Wuk ]Qsl~Mx6 {.!\ ق| n7mɼ`xfu&opg^mA\(Ih?|}eeRT<ϰ"H$G`Q߰\ Bπ\k !) = : u .NUydiH w)2 nB~v&:]s+ґ'Y"$)9o)C,I3=141BqDBe@] m#WlNJ U[D}%i}ycnmkngY{+&k,y~ӣ_L2Wlv[d[c&VHփUŪb<'6o կKbZ8q+PGET;yt&τ%,ao%pn:Y?&f.&Lu{x';jr3_oMV BaU`\J}4X\COS fd(i3²ov-bayT"KR<=s,*gU+ 52(VDy8K4.w:hb{wOǛaO*n `@cRH'4k+[K^a~e:[TEY3Nj'ct#WgBYeTm[*8Fଯ;t~e(`wMXWԺ)zϟ BV^n>Ny|9tr͛?M~-<_g//Pw*sxc3E%AJ#`ޚ@wʪ?䫈ɍޮ3ZQEuWMi:xq#ϡfry9oQ_TsBr q iFp%>a`F("BΝ$&ÎEuJ󨩒 p͓O 5iٳwQNYX!C42-hb"NF(Z >_H¸v>.z7yTX+8a,D$يrq8643 nWUTEP Z-+9,YP%[9(A\Ry)\X EWH[csQzBʜWQ)<_zũGŋ$< !E" ,Z@~Հ Uk_+D\dCsM4˂fb= F e7 $dt$xCpb83ˆfz@>v՘\hŖ@GZҤ) l5Z9gk7úo/?E{>v@#h<,l$˵C=`:q@ Wze17Z>P%/t \nE5-^RE  [=@ puqq{9Y[g4C#MK ,ӹ0P+p!\\QZllr5y'= hŇnDg(La|F^^. 
%=ׯ_B6U?HF8 IHgj%9U2>~q BzxzL!3$0bM8͸Z8UNOIƇe!|_BW[pa '|!wBh Xaf"ei&Pಫ6@ځ73<6a$F>_ߞnU<5Ea?yb6$Bxz/fyBY}4CV{h4 :#W?tR*,M.z6*"; lWR,cQUTJϗ]UWi|,2{^nлGɗj琲S*{E8^3>k菏ڄCofqPBQ s&TA2>wkR2G?E;f u!a ߡ&:UR}BcѲ{4ߟߞ/g诃6f"z=!ҝXO7OwCZqV+J|D U1!)g<9.)Q45` 8gsH0Zs-O¬ _[(p_[P}o窴K[fjv},] lba6eBםmWvz0( ƘcQ90PJBŜ*&1w+'3;kZ,#fP+w fz8޺X=VV% <8Pr_wU3Ŏϯ*/`ԃO=?"*#}JGϪes d B=˫y8bf m瘭<.M7ӽ5-_i:v&C+; zcs몘*Č{}u8rvɧ#y,|DkM`z]8(4 Q`{ `NI0gare|Ck2\yK[0(Q֚(m/gAy"-Ҷ^GKY E\ %ATHk-#rp PdkxNgMp-P1Db>j}o:藣vxh,lCy(}O7vq N3FzEΦĄFnGoH2TtMQϣgSY_qlk\vr|>vaCu:Wkf\z ̖Nu2՗<$%'FuM4l^h~葀TzyL!fD()>a~Qiݧ84siuwrTQ5 "g{%ҎFxMύ&wj;QSm'6 BtE:`َ%gJknb `)W8iWEM!R8Ò#㉀KIRcSU_ew)(3d-pOي{:ʨDT1EURjǍPGOE.X4sb42mhJ`JpWWV%LmlUOge? )k:RjA:rGjtd3$귥Rk|ag#AmRcf(PܶZ<'}JTN:'<D:c,Zj, jD~VB#}ZUJМf(֯@_歵(ы=c (\hik-D<%s@Eߍug?5C6U6d=;ϟc=/1Ry{u5b-d_ӈMq#yep+&PO}l\샚e.x%q*+ \e"WQ핦Bן6&t6jJʐIɊMwޝzM9:fh&hgD=yl<hLUꉪG4X^D)9BFY{ۑ N!K0Nz~i!Gj79磃yKu=˹uڝ Ё#Vi!E˶\FT4ctk+%6u}Dž&XcWvZ6a;K@Ҍ yb'U;q"FUG(($kN6Y 鈧T/ `d22mLENA$,KK$iLrn q Ԓȱ/ ԍ\#su'뢺'%Q]Gu vQ559ѫ^]2"g8Qta\L(9{f2帏spM S\3Lp b $TAkv E1WfK FW|XCz醕{:N',t:ݸl'sF0 9(0GjBm[)XU6Vq#"5Vd,SD DGI 51 /e?tw1W&hZ/ oG^"Kù)U™kSsv8ff8)!('|?c% ٙ k9Q_)amp}&wa-qBr[rTV/'FpexX@419A%XbE4, WCUf;[uVTċрJX 2KQqj2FYhXҫ=-{/(ER R օQ5Z "Q/ ٍZS=K$T(2 UZ)+bvHv*LK%%RCbm@cKy~Cwxsf6U*3ͨrƨ I2/9P1.{ K…(q0t .)$j0KZ5ǹ'wm:ޣWATMf=Fn> MOG2^Z9J40h'onY[֗D@(~ɼџӐF l[wsdo77/}9q|fA_|;???,"hud-[i\ߍ0 ^tniܕQ;o^)CZ1ms嶽ט{oGF^.]MCF5^ϥ8-?Msz3c[kz72ӻX7}bg#kn;k5"H ۖ7 c;{>bxl zD~.DhkƊdyP(4Cq3aa1ŰvPL #ȶtP8qP eC!/@CѲYm^џE`Ϭ%֤Bzf?.pKZ Jth [ iM֓y6xA=D[cM(Y]- ޮ7}_z9na&F WQ!'N&.&w~^>܉^v}pzޏnS kb rp̊7 Z_(mcTrֿ zy6 flC댕9dN\AUh­7*1M3 ˴I,ѡT+^-)45H@iv(ktANiA]>9pWkeCQkǝx 7QRRJŹm޽&رɣUx"P4c`˙wxSF z͙0l^+%)7QZp=r"%Ep$d݁@Sp˜]Z!"m|%(&LK}fۑVSq5y,{'˦u͵|c|}2#wMy@83ߡWԱהPţ: TG#s 2^?Yf7p{8ih`o'7y`ᑆz0 !a"``L5ۙf͓Iۻb|S{p]j,WOFnsrWHd8Jâ \OA܅׵w{-ssw6σ-UDQqq*k} j_ lmZY[E2iLU\c~8& ;|oʳ_8sK{U}^p=ҦyuGCtBz@xK7z7L)n.%zV /J_Ѵ%Nc scrYRCJHTe$KD*R\$.xmqeO9"M/A Qם<.lȥY,vM 5S3"cxNO.}*X[[,TC\'[Oq.@yr=_qs ] QB.|+ʃ'.6'Zk۫F捹%AmGT{Ldxiqo~ޙ\ߍ^2ueT)ݥF;}͗ęzH vЭKPZjHqJxG0XK0%D7|^An7`6f)Ѥ1&ߦGN'+i};UG% 
M(Z]n}EMeppeGkS[ig$MDXN|˲D0 J@ g4 BԚVցZ)!k%Y ɊDtB|O8pGrH q#%B5.\>ՉO 2a E49X>q'?oo\ppiƕ'rށ#FQ!|Xe $<%PV9W /k~>O7ϵ]4;2*QxtR{h3"7$\tSHMdBQ:b%I\LG( G)-r[=8Whf2O)I`j u(+#RI0!Z}Bι\Nf&P\fu!x`&7pɠa^B-qcZI!s'w^'#m7J'sD|=*'ꬵHKLF(b{C]A̓l4pp74cղ< &=E]py.*;6)jxv?}7rNS#2WTZ@ hMlo_q#ZL&J!P<. 6]_̢A#\ QrPƧ@ZO(B4h y%{Ӥu`R&1c& G"%[ǒ"h!<-#9Vm<<]yv4x}wW++FE=/뉝~S?CJXݦ4d%RnulϫȒ[].i&j4Zh%J72>zUQy,lSd)hh hf8f Y',67ow Aq\O4R~ 0`[>=Aoz᧴P|TpV& =~TdBnkɶϢI$sǯtk.Og_^b?T3ǯX(Ï>wWZ8od>fr_t/c3ees; idqkK2X&gֺH_ GD߁dݝ3~ފq:= (\m{C6=̑mS ;5D0ؖ=젰^UU:BE[NUU7\iOEy+mSт3Rv "v :k_rġB)p.ye''~WUȟ?YL1Jϣ:8|)TuO5gW]dO`'ӥ|ѳt* VϦYK?MUMwveRu bޙ}M|~ \gW{**cbXϥoJ?( B]bȣX5[cwb\Z!~AzvVk5w0:Jid@NNlVt‹kxZXiYͦx]\CwBZ ?ܧ_Ǚ G2̤V9^oO#`"on>bey]5@w(]hߛWFxl5%c+L=t=ė&t큵^8Z5Y'sGh1gU+؝VQkxh2k4{g2ܴ38?O5n6o fk٪dyi{wi;RHCyN; 2&][lز=c~Ƿv|vVVmA5G6M#}3}d~rL>0G- %WlktwF4^ ^<˽cHNKw&45F&(*2ƤWf/l6ID!STXWx X/z^+6OvXZs?CzNX lɿ6syM#Pf{~>CSxLMV' w&=)/Y8s+b{#&d쾿{,S֚!:RH"9М8B:P{74)32h@ϑ,AՃ*X㊴5s@Pǘ{8sn<ռǹ1}+7l(*o Ar!ʙ˥r~<ttz*8Imm*Њ`έ=*Iܠ )Ţ*zzƚR<6'g_.']!pN]~ٹKO=g5<)Ő Ozz)dy<~dX{W@VKS)OѶ)R7JߐNvKNX]%'nZbڳ7t&;}4R:0GyDd_}s-M/"ÞyץWu#O0jPWξX+.,/CI=ۮ@ {'^);o]#7l0vURwPinOun#O(=v~d+^9~m~k{?7?sXr AC!kAj`ht`0tPxi L2 9scK@ [u) FIu2\Z%jhGqP\27MV@dEjf:@hZZZ5vO:!TgY+(\V+ GCi[pPZaoY`0R4ԍ>8$* =w#cm\Csud+H-"*:dg~2c#*r&N[ϴxaK93 4['V\[R ⨁4F;ɀU?}jḵRM .X,$Θ")s+VnhO5)'ȓHlE:rn%#y0){)ۋd_kߎ~e}Tl11~#Fn}|xSϟnGWU/BXQ/M>t6Nx+}ێ/}1yte(~|O#;-#7߿Bŏ JzVfV+FB^ɔ< :fD1"'ii9@1 co8R & -ZP ʈNXy+Ƭ[BZ*$䕋hL3Q!0di @YBrSXXj8'Lh$0 y[tX7=] -Rc'n.Oe[BKnH+g4SqBPPA%GCu  @ 9 a!<Ⱥa `B1(#:sb$b"n -kݪW.A2y֍3f&bh pX-a8940<`T)ɝtA֍Nn9Nn tWiG-xe[E4H*!8R$&(# _:f*ިQYbU-2:)֭AJ(2nв֭ y"$S̓) yk@IJGo,[t c0 $d AӢu Šqukvyʚ;/uBB^gL% fƟ3$P`$ sBK:GűUq(ϛPU8߉i 9N:6]IDLQ'Ѕ>V !\Ddv^ݵnvºbPjc $߭9OIs! -U !\Dd Z7w ĠTP#pnՕ{ -w@B^fC5.P>ԵW 0]eC P]& ;_K5be+Aewcj++Pk+w CޫC=͚-mfUlֺZK5.4{*4c]BC-P R&0BS-P7%ƽ]$Cy1W 2ڿ3 vr̯"LD1*Őcr̕j+v79f"rCRM(6McR( =1SȐcr̕j9fJ%rCRM`@/9!\&Hcf(c~9f_1_,q/F4{aUwmN̦ٹn I#% ,Hi ͉EC+rv?G^fr9|2C]䃽;Mo.W/?YL{;S1#|P?ԝ{^Xph2|%b=;C!X{^%[+c(5mV|Wl[RcBm4Fi1͓ٟVfHKRz}mg^3ʚX'[c.d3]g&04E:L}u :t:<šntY5[5E<ݦPj^3PYr*&K}ZOrF. 
kк\u3ZLY n?`o>3E@Ps;xQB z T;? Ymi>JZ[k3\1K¼^1On\Wjg9 W!p7w6K:MR`M0(p:GX+}8: < |u5˺6!s"3 (-VlxZðc**}Q< PMR|`g,ZNC=n=j-ee~!nd36S RkY$HE"V0BUAt/ۏW觺cm>_ "$X Z *u}/΁9jXuc(.|?dOca{堧Rep!ϔzÍEIFKrƓy c\&~ ?CYOǩ5ʰg#2 Lފ\8 4uB'2AwBXbHkX3c0ț(d<oA:-ɲ!Sl6i3V:JCǾ c*S`$-x}f B,u#h]" &@䮟kdDC`! B f cfJҼ'׽In%!սJWF#wIHh.}1EDE;d_B~=:U{6rU0@ (~Q:[$ھݒDΧ.Za}?VYVyIoe古ˮN,&V,.?炲~pdɒOR派:j=Jʌp3Wjf CL:(BY()[܄*f4]V I- nj!| @6f!B\ 2!ƱDmt /LR*u؆cR B .-fbPB 7^t@8f1KhL*@B A^Z%,R"CM̑c)P,=ÙEXzsZHaJZw4P`&0v 8ujK ^PpF#bls\ƍXYcX wd%)ca/9d;KPcu $(NbA*1#x;/D+ ԟcw_J(>h~O,Q79?Ko;''ILdO5DRf(.yb!=| 6#F8EEZ/6~ I~QܒF&/L(IxB@}M_եN0e}/MS2e%% %yD'-}e̼!\OxĦ^{<ޔ#Qn6hSpSǰҶ|r<צu۷yEbGE6_do/~6o .խU6okj2|wemI 2Pbc:²=+N c'߷Q>m+$GuV~YYYYy.IQ2u?\{dSE :$^+^mՊ;=ÙiA , 3i9ΰryhʬ*y~E«#/Njq@ hNǂqƎ "to/n6_^>W(++-K:q2)2`+AWW2{wҭLȥ[e% 8Q\(N*>&0eBIwl_'~cӯ,8`D6Űo;*KHV!~?>uNَ^lNs/Շ- CЫ^? Z<abۢ};zifApFq)G|ޗgeϓɪcC3sjr>,u 7{X~&Uw<'D9 ,)CtNIÐsF*ΑD@!ů d:XTy`4p M7ыоC_ 8E_k-br hn|(߷%=x'Ҕt?ȷj2[z]~;`ф-fE:b6 ǃ9`atw~nvGvfZо-ȶs]Hۣ:gmsļ߮GUh)?~bnPhL"1\QQ f 7e#bG@N,C(b  ~VD!<*T3ϟ-@̴RTlCXQ[Ijb:4ִܴ?sXn( RݛŴ?oʴOv)%hôm^A˰y^ qC qRp gTp ‰rsYŔt\9XOrer8 vo0DN'Wl" o@ݫ+?}h6_,O ,3e/V)ߙ.H"q1Eܾ+dH`RJö#*Hs1!A #?,Q~0 HߤƔײM|rH9"Ϲ O]ڱnoso8!(sQ{1A@4[9 #e d]m1Q+-_뙮mR 󃝥#GKO]>aͶWr7SJɲ/s˃? )nACӆlt}JK%(]ş+[TK."„0:46R(1l s*#nFo!SoQ5$[tu4%E[ԣ,y[;Qf?Q'"8cD9U+.XRy5Zaqb">LRF;b!͔ ·_\X3 q"VT|M<P1p^sE9$W"yqTs&%!lTP\R Y 2qz9ڰGju^]6_$^w2"HJ7tUN{siT<͝32k>Qp5_߇-qѥ'FӽfP #߇Oփ W4u4 m+&蠶U~2#k4GA#6q,s L0TJHn'Z:5CImtf 6LjMF0XeQh>Y Dt'5>3_ψO^w"]:lֿfN/W)s!4~t&|ĉOxpwv:RxMc"s4O+I "{xqMlF]RAPW[jY{&Eᶈ/eɚoyjϮ ?' diC6$Pk.{gn];bh%X8.$%P֊K@yZB oΔx3w% ?(qkY|% JGWF=;P&hw 8߾ b3Mn] <5 [FSɞ,$-kr͜>p%+YAHwvW/7ɥ o?>]XCgwO<@gRFgB &aƜ52%فgZ5eJfo+X^Ҏ!n_2;!]  {5 jMp K;Q(c$oxo-=Rf ;(f dRv ~K/ ʩ W$fFSy>.T0Ŝt N G#1BLP Ѐ0YłcBB;đ2Lk$?jpC1wfpxo @. 
5 (b/ 3hx=wxPRH铭lebfVZHD1%S&9)c""~deWbOk|KH4`}Ɲ ZMC6lHcϠBh%V 7`saPSj, \i6Ϸ-1!D\,EgvfZzތNQRډ*K6Wg3d9&~>_ x|c19_ZV*Ղ+M_R_N_ɉ/ŧ;d4RQ1guqʟڽaMVak&N]{:55ZcB96yRY6jϮP҅i|0o P,8k2Gkq]ݻ̗<9xbQTCW'}>9W?-@)vRSIy`@KW[j,zQX˳t54E-r<к뇄wj.m-g96ҊE3&AtOwcqk3n_D!ii5@aفgƞ"%, X BPs~t7d)T)@0ul 8D D/{+flQ pRi_9s%&r\_ӹհpaFco~RK#՘ƌK K.|p|O}P7%M+%gfl4[q(-XkHቨPp@<(2Dy+!VXhbOFMrJjjg|q_Z'qQީwSRwK1Cg%6D p[a8tH79[kEv]>ȂfôO *ñ*:>GwU-HX+j)0S1)d{J1_dJ $\rb0 a+e*˽*0 V@j['&a6IqEr@= íbH2N&Fb4̆ QtQ`pR\RS +0w_w8DbAcS}! 6~~' .y0@8Tȁ;Å' V/vNȐBX_xX`W{ `{!'p80k)-1(ᔛLn߲m?I%z1-e~w:8Ӡ.ޫ[>jΩ=ZB#%!C&uVdO7R^XΣ4(%M!ym5Ӊ&-+x7zWC@`0wYI| ڃG !vd"XewyDw9BgwkZJMl:mjz#wHތo#|c9?+ 4ۥ8&wq;%THKa`wQ&ϡ|cC2" ͔&, ^{F3%l9E!\OyJ=xfKHKN?γyT9ZP`XInYfQFepp g偷!K KCBwXvE8a86d@x89+&pFPd  i&Fx6DN%RV./ :,n}Kzmc]G'7Erg(*8f JҁKZQqk 3-2t(!R2PZR`zx6DG.%RZk'h.z2: ߩǠ9_,3OqWeN X[PSpUS+?@Tġ}gj"gs1\y[bѲIxPpcox*k=e0Iqkr2^3+?MgZ&p(34Y#NbRHy\2]s7WXrG jUw'/IJTHʎCJ, 13m\E_7F[!G 6$pnGY;h R9 ںyocm]<𣵓ƳzC$lsb[n$jYM} | J75m:,oZ|\7)rU -mt,`BRqvk`[OZ274Kl*HĴrMVd!vkG1vkGڪ\@…AZ1T9&8 8q J'unGYrrgXc5u~$UJͤcaQ4FhU&0EÇ0PpB``r#y%4V0ytDAVC7t z2f.~[6 (1mp^sɔ)HZ:YH*d{HZG2%F (YHj"x{H>g5r)\c%P]_JęWӉiOJy!EFPv&E!qlR"6fBbbJ-p 1"O[1R{9R3XVi9^I68 %Z3X6+ςC~n}wRhl, XZ9, y BAA͓.t |I+ Ƣ6kբ_0L?H"!0 A K8" (  &gց)% YFT)Å;\*AA; BR_Hd('9I/or%HTJ<-s7u=o݇bfb`Pt=PBoZb%ev߿"^>|߯<0/xs=~&Q7wW} X*~ >lgŊ'۫>|B;3_|5ϯߙMt^v/d60neh6* s 󌥳KvrЄ[7*;;I lo&bh6;8["R0n.S%.hҁagA8-qh9YK$5a Ƒ^ոm]%L tKzsE -jaDb,K\( D.$l^(!q%#zi$c@ՊE0@ CaHJaXG ҥԂ\#2Ia#S^&RKU_&,M2HaUX}>;  {mЅSK15BT$5[0ODNUI܌șb< x-'2~351gyhz[(׏aA n2W///]7VP3%,&ƅ 7r`я"#c")\!֙vkRoH帯&1RJd5)Z1 nN"料wTY)C{lzg\Ca'fx;xwe e9 B) v(-> xv Vj@=0hYC³+DpV"S3 :^ʧv'3x37Y10i^\&ό]gV⬝$]U>cYpz3.-{^gܣH4}ZvA8UՊOGcH/zI$qTdH1b)gؕ,lO͟#rh_JSiG2(ʴ8=鬷:kFKᎻ~/6^pr 0M2v= 肗jhA6d q VG!8 7 +C aI1``g!G*Hg- [8lmR>f}$Tא=#8|T`YG[iITS`&Loȡt&f=`AV$e\1LfݕvWӒ̩˝NXOQ u/ai3DkUp^ݳ w ?+\EwY;}}MϪxJ59PIA(4u:DqZv~k@ np; Vlz kfy1)L A7\,1ұxA`V“AsDup̲ ?ˋ ɤޔY)xpVI> Da)XaxV8KHD9Ȯ0?hAGtbXXq@ýޝͣ,x h4;3/"[R0#?Bi-Si[FsarSH=Mi~p{V6/reIH{s2@>I;"ZyD^5I8!4Gm){ ڂ=rj$1h_(ARK 
9EU{;N+sPJ=;~Ͻ5d3{9}{ߤp|{eo[riÊ\a["🆽oo7 {]fY~}e&d=1=Y{WysU>5+ۧmvˎgz{"aC ptiЉMEll~IMQ6;9emdeG;H]X&|{yٹq9*U5U̹ W3`$xA@1?cUlڊZN.eQfг;eT@齰k8%ND\JsaOCER^EEW 7w@;jDCnFYa siǕc)jҞ$ }xˈ%X _H% "{3t_uOܺ'gCGpX `T =: &*&X#W;P!GA alaA1Pa mbCY ykihly_ l3 HTڢZTr 4ENX̺LBפֿzEY2[_gwnY:N%1]9]|X۩U S6?/7տ~ %q@q? # ͑Xᑲ-`n+_f_tnY#۸^JOX}mXTXLݐ|"zLi¹:RRQW_S)0OXUm a)J0XiR.a(bA1OjA@%b̓l8iSpmyqFK4TD$zuZ=Y{\U>T*Zz.7HR.fJ3W\P)T bEp cj07SkL t cJ6]-&C -ݥ1Om1r*Ik{e=$ JhNqK/7e=4Z\ŷ$=/%!q A-5$Q'Ƥ-svBF×:kp2eGxF׌7t#jY ]ĭC 듟}䦓BVT>)S_YΓ{7/-81͍!5~"(,] |E+RE8Br%5,3xj,VF"RLt P,@@H]ik`5͚)N (frw5uY beVĚWQd*SDqTZ5r+%!jN9PTZ5 =[ͩ!J撫? FvDV 6?I[Db=ˆl*yl{@9mOy)aC̑JT&w<}(,˴v<_>N+WIr1Ɉ|y~ye!J-OaMQn`i%J SJ`NJ[6wXHaV v8* aTW&HŨ]D;+ԶȐj~Y{/\:ʶgMWi;s5na\) κy>IB*H;UΆG;4"z>[΅RjpL^It>711aՌ7Q=oiz^gb\Uo߿5\OF"1`0(=}UpuEt_C7'#FOߘD+sdjU|3 SKf"S>NA1ͦݔ+z,|iN/t3M8e8 evǹc'>x%ehuIELH zraQ~j{&Č?z5Nս`WN2_p>Z¹~8Gq6lUg7OHC05^E0DFrZ gfcY'("21MɔKeٻ8r#Wz݇A"q"e=}؍ ?98G(r;/UdK=}_&yRBk/>Ųjn],>{.oJ!Wҥ+4vx tuK6喵S =0#_ הRfc% "3#m腫cMszc֖֡º +6XQΕN/T\Q̢ɪu29DR9%JĬ|"bpG PbfwФ ebBJz=% /"d+ֲseo+u s($\2Y$AmZ@U`), lwI-()U炘8JكeHjx䃖ɑ¢^7VF=`2G,q@nek*P՟WeiKY ~Ձo-ݰǪ6x7k7ru'r?|bY۹m^vnSŹ=R>R,B"\Gn2F݃^_~QʗOnL#ai bj& -;-6dînՒR|3APvK;8Z0j ̴TBe*kR"bVxYL@2KQ㦯3u'\Kf?jBh9ge}wMZ2,rw `1K()7!PSnI9fko`W|;-(hN1_Ѡ4,T}vǘ8x@ DMŋ&}zkXu{[b:EJʡ,&XLA T3gcpF2O 1K OmaO\8yLs{toEm99FzWaRmhVԠ~rws)f}(,4 ,|)l->JKPKt6^%9AL^`2 )̈_z(G罾y aGNPv:&7̵¢^ gQ7iZɮfk43Sq&bNE(ޞS'jّ ?RcYKè~1{V3e~eeZʣ:v5X,7!ФV/_ bN᭸{| z;NgTb ;2.m`Q:6x;g1 $c( X=Idͼ-i(YP2т]I\:g[~$:]U{wySP_9b}?~5`/Q\u.)fQY|LF'74&GzMܵQu>L5S|'gQ#im}E2G\x|XKėOH~M W'#M4D> 9lj^>sOשs1/߻ؿt]}|o=nj1/]ESEX7|Gf+|u5`(պl:%^>=Mk{##ykGK#NOVf}WXH򶹱  Kp|U.QW45Nͅ ġ#x t'ĠÅ |mm\]3_ae| {`_[qE:U̟jU^5/k^5/y̹"XW"& !c3N[f.V/ފ}0j5~^eK7ji~}3iٺϼ7[_?X'R3Vs" -.(үE4n\pZLmx[h }b+* q;VZ4jvzϖs-]F߰uVt}o]-~]vkwǫwϟnn]Q퍇Ⱦ(sg燦1 ~qcL+˞!R⳱ . 
_,WF1 cf@HMj<o`$ &e)v%XY.#f#@P:u4Hik&n)~^se8Eh*.biX"cL ioPd;e-;)`H"~0w,vgD @% 6 edzBg ݦj"B{"uoj qUrfT`.$6}lDȅ!_KhJB)C~(:5 ݆FUv&1wW~?^%"&;"Mt.x)eT|OJgϟ᧸yהXS$ii:`Dk97߼IOT "V  =u1+cp'y(T\Eg^EQ)Lh 8 V>&ET=1p!Ѱ˲]Vs"TtFcNfXBGC֮ Ĭs (*г1~s*9 Z`TQF)|$bf b]`s(YвXM6` Z]vɳ٫yWH2y ft0sY01Eb˚cȚ bp6 }1<3:M9s{&PBf B{J)ޏղ/ 1L*1Y}R"z@(]/)R;^ j!7ȚiYMHk[ \h,uwqњKSw<91nyL:sKJ4p}F[uS!]g\Z'gԂZiL~4^0G ٚE jmhiv 2-ݶ =WU[yCᠥ Th7brc1mt ɿm8^˪r/ʽ*rI5Qe'tZ:ETEU6ʢTѣRI =<MrbeKBve,']jʶZdGH82ͮwl+#~Q\~Q>/pr2! {JŌƪV@ALuɝPQjUjeWk#<ɝ4MM-QUE!l9PiJN#"\rI"ZzQHKLjC r*#YU9Ǹ4 hDD%ߕ;iV|"{ lOlf fݨv t_W$ _hB'+\;${['@Rq!}֍278t*5epo"_ PsQ̌"Y&mf(sɊ"kXm ͵hT hA$7e=6k_OcQĭx"%jn+tL h{:]MJ!e 5'+MCyo eEŐ0JUECd<~x\hAeuxd0gҝYv|uʓ2BESFގfnps7ߜ(^hemΧA9Wy1>Q&x\3Ȼ',)IA2U-ѻG}s-({4K˵`D 8!ښ43N0OJp m}=ۻ+=e δ`@1EdG~蜃Y? ~d(Q]s+TBPg]_z8xuw9,>o7<.WwdP|YJ}Fmw*n)>beE!g9&i\T\7υLo-b<4>ҳօ2-Og"!6;EE/l6d(^JBʪ(tJ( usU47,h7 P׊o zr$~]I7xm;r}he xAkw%IwoWiR;9 ؆~QAtlu7̦?8JKEvv~5>4^|dC J\H'?P}">K wj=-c}_CҾ. l ]NkeKZ;CеVPSR0HE(*̥F˔'9DaݡR hI*@ъ{C?58QKӻQ٥,_h:'+r}񮄦K-h5ntCDBjŷu96IkmEg(?HBݷ$B I^$\iM^Y\A|&rYfLXo=*)]ְu.c֊I1yM=<(2Eb}" A -%/#mjISRЬQ.:Qk:#;r(2Ǖ@#!JnNǧŹS/ݯF\Wa- OISHTrƼƼxw}1(H![-!AjuM(3zVw?Q2K=+A(c1)9acAlKRrS*hYaF(NRșE-))ky􇄱%:t.F^gOK :((=<ǯ2 o^ޭ&^}φl`val)P 6b#"?Psʡ6-m=ِ%T06TZJY8!p obdb80g9ZDYl a3%%1:4hE*ܮS BkaIv=6s*zlv 4 [ksdGFJ\< [@ ʣ(ԛ<.P35qȠff ^l {fnSq9pIpesS Pr̽'@}TNқX˕T_-T#Tcޅ:{A ^,GmHSx3o ^AÜÕBm1M"u&FXA5XY{!qɵC~OTpH#I5Qiz$j։Ѓ SiKNIG9])47O 75ٴܟt t74ktҀ>" BtԀ՘^ ?0l鉡p`V2rP(80#(8ZL#"(S9xѨ&H΃o v~n#U?JP`ӥlS Jiq+O>bR*x2_n>9Q4|ϧ甩!?B+?]?{ȍ}^),sdws`M|Y["[Ֆ%ny&FuaXՉE}s $Z/Д&:VɔػLԐ U)wRwkP֝,;$MB9 DS!4W6 韢i|j?9Jb%)D[P-leuޤMN5(\gCQ]|6ڑet>/ݔ<3CeZlYMgҠg=ןx+Ěvxybx1eeuTq%&H Zki0 . X 4B17%y> 3aU _[Jf|$L(n U ,X4ўD6jZIk]]9b(h6.ӆʀܶJc* & "S`y833¹1z&Qy<+A" 6;(t19Pj*F\ԒĦW]-WYQ|Ɯ <:^ _Tw(|vLV3* U 8`α\9B Ժf:c &kց"K1 W#h!*%KGx0w4 'K5hK7 PFHmjHeBP+-| b.blż?֊͞u4ampjMW`m gTtϟ%E$T糕Mi|9!P? 
:O5SAY`fq)@ B9n3g2icR±owN*ZAj] ʗPÀ'V3 4[Ln ZTm4ᢋ1Jڡ(+) o]ԸᤑO99ÉjU=5 y + X.Ӎ7OF9?7Y}\ oPRuTg@S_&,]tb S'ͭNR.䀘|Ӌ>1:#ˑ@>_-7KZ"65 J0 4GPY \Ѓ^HA&|(;tzk-hwuKգE]qe:Y I՚31NA9_`//`n YRy$whu#2qh1tsw"" C:z 1@N?5ۏȧ%z{y*hRs.P-Fha ӖeeI.84hqȹ ,C$Z *!\@Ix*`"ynLG bbIJ{LG(~L Alf.sD jԩ)3MHˇhׇ*ESzf mDNV1w0P|=gi-&m+t\1Pn'T^Ynnr)n/`zXE7ک`@7feX}t\Cej)Te|8-BeqZ?tia-',@{-8OQXGu[Zggy;^%x.aIs9:2GbLu Gcӻ,{Ղ&ȱK(֝m'.{*X50dit$(W|c(A8uDJMKpMʌεfId'}=%U?e=t%c%ݯ>*aZR5kKZrTx+^4R e *c-(*bN 8Rp x`l'0B0)b}|9.9{%B>?)(:PSpɵY "#F[)'$@< %n*8ظ=Oj"8jumC魝&EY[ʈv'P&HcvGix։g߶s|[2$8i ú5&?Bu XF5S"j=$}*'7zm-]zUSwЯjj>vށJi VD=zq{{]ܾ"ܽۻHTP& v+^^[pײ͘1cT~{ho!8؄R8AD<8`*NVND^)ZMӞj; X3MS:IS0Q,Ǖ%t䎩W+C lzFd ^x_0..q*0wAROBl)4iwHuxrH۵~%>K}߫Էו!&L4X2֜KcGFO ˨Rsh_5h$ANE }].}ۓ^ 55VMvJ g?[@0' V,dޫIʴ4w"c!% 1y3,pЦ`@Ḵ"r (CȺ:]hIe&cg);K!f˶էOҙXa ?40geSِ`y ZiWVm'_m'!+龹7Ӛl'liTQ ӓ[8`3CŒz)!h9(~)ﯮQ]CrJ* [c3 WҀ>]?f?s@[ )2<<&3dNc LjN7޺ /Z_6sOU,VP)o%#D3bj;֗ͽN}so3G!K'乛|!G:2Vt Plk!O["Y3"4l~#ByGj_.M%MSH;Ji8g ψx̮֚^\[{Q9;sЃJ5آSpﯯ?FWu]_(3:6Wws>(Wts0PLo.g'/kGwB0G=+M>yǍ-xM}-Pl],0E,ѸTO'Ӹ^'0t\ˢhtPD1D5 DI&kx9rxiER)+kSWMݦ4 :c->iŖsMl`1EG*Sk; FG{ =X=R&g OzS GLϗʠ͑PWh!EEEd;{knOwr^}$ {kmVHCEb(6JE!Ћ}L% Qv3֩%@ QśED?{T忊bWdvѲ-z& !ᄃ;IH|k\!A}oկNg9GTRGlw_lZ;WXDMKHk<\S''p8}&W&+L!<:]K 1W%.OXp {Xͱ[u۸ڀ9, W aZql6a/2=={}u*\v/9% :mc,|~lW\0Jq'Q ղ#6c#A:^^taB!E)Hnl& 7ṦdD=2rG]У&԰JI2"#D(Qpb*Y^NE|f1E$VV7u19;cM^uf[؀rYû ׭!ўdh-hԔ;u'TI J^,#(ɵ 3ƚmbUZc (, 1dw$zE$$ S`'Ro0wIZq6jTplTsʞ$p2;K~i,{90 a\>հ RbjoxrT2XW4as79_Ʈ8u|rK.7\7-y:Op9'llins6ZmƓ65%gզFgLRLeTx+xdgFwc&3Krώ[k.e0ŷmѩmp4~<#6WeM7逊 A )pb% } v]~,ąl80i//py;h{Zl!p56A(6;D }t'+?m%!xt>O] ԏKSdBJ_>M0FLTM;ZV̦i0Tܫ-PSW~p`T(ӓ8>T;7s'`b$4 'C&LIa^ L\wG5AƠS`KM!+t`E&nUa\c{ƸQ5@hNbQ饾k;:1)XУBWN8Ã/z`Z%ϱ稵ۯ;8NĥIhgfEVj#LZ`kpK3w9d"q:2zr,0!*88ov廉`N7U! 
JGz$`Z4|H.ԓ !p@J)o]WVc\ݼ1M9rxwo' b+BPθ#J eѠ{Ƹ"|CB:oto|j쳁]4.JW>.0U$ ڿUD?-Xt2ÅOlQFSϟArh\VΠ k7ۏ`/-'9AZx!sHTFzl`>p\ ۶q' 4 5eHIw%j56eVذ6aC_ۑN"|ٌr$aFx?a)" I H ~< 4>P8\t!T G̀XQ!Z> 4;" 3_r#vIAA`bc [6j#0sVI"KbʾYrΔg aFĔ"1x4Z&w L`Yl|RAxO(я ț[@8F5V#,/7>xf9|0Z:x=M\"b!@$%2ybq8}X4LQ#&yB.`VvZ_UJW~$~$~$~$uַe}ՒX6jj A#l_$b$u{fc(2EIQ0R X1&.+9##hTǻiw׊"NN}cH!(#S@fr;#y*qV AVjt`P &BE#8x7qϷק'8,;H'ݨ#; Y-,xr-*yVox{~ʎ5zVBYOL5e۸$pJ+Q^W@t]>reӟOO`&a_[M菧?{Bhm\%Q"rb az=0n |c}Ƈ<0![oNgoA!>v:[P\-M/Og[vyr^z?^_yO;ӓxtg^_e W/~}zBg/gg:b_F_+Fϣ_ē٧7 BG룟S;gNwm 4LAhcе{qq)1EEd;BҐ)bgDΣ7nuxr߼͋~Jccrm}Pz*wN~_ƿ}=^]'fx~Íؿ0apux.~qږ_iJG_\@`MDa_~r Su`=I*ܟ7ꢔW/3z$vWb{TjxR{Ս}uPW~$n`[럍V^C> Bitwed;s4ۃ]󟃟`yAO7ςͥW݋/o/Kjy_0#/#OY:+)=np5SDhB9 SnQ O)$NeJqIٵ@Ӝ)5VJaS^:\{tQ-FGʞݘ1)tJADiDPut0N q&$-XH&xc(F;+ {*=:ިh{AL9&cM7W4Au"0 &C64"5sS:e0RGn &|Ui4}4Q%+):֡$|?~x[՗'%3oߝ9z}ĩT|7z]߽-GèL][IR7Cӹ.U@wL^+v^sθ/m!_oU)WM^w;w*쩽ieuMS9z{ LR5h} yboH\gHkqX)!Y,@ RG-$* قPt~As#RDڴ&(Rr; ˼wT N[I4@.?k4@c4>sÍk3 7A<ٰ%M٘+cѓsv /%*C)fGT[%i>>)BoNfOw7/%RᎹ3K+&.gf3AV0>WRz1<]`SV΃&=gK$ck :w,1&LRC1 /ݪuӽriEjjf/ilj9Vch5!%"X 017H &ZK` jpϝUUxtlCVvSyZ^s -$`e%/3kBRʊAO`X*)1*!%ԧpGa00JzGL>\tXMҕ4Jh\oZwEzWwEzWTՑm:qdVTJ36\%HUG}RZ%^jYWxAx%u,5E, [ZDl(:*R_ZPecԚZ&j\hDP8`5T 0 OT 6xdNqj5H.TjDd= ENȨ# <3*(F E`2Iu*ǂIJº[(U_;XEXE"cF-k.DlMn 5vkJW[Ƕ%c}|?NSHpFeMtF®Eju5LZr 8*`yQF@p< H%ZFch i:^_ D ̴Of`d)"X8pU)c-vVHzDK^;@)kX#`EN "/[-ZHvHtkeSֲF@"tYGS_ן6x!{L$ Zb=!` e,,^`$I|L 8I*5E:hkl0B]#T-0@kZz7o&n^L\oU8zt6,;LbI9t~tM#9S"(bKPDb115-9War)WBIB@ɩ <7gYʄ$9lB\?O]8,L?.)0>9w~k-:EE$M ^IJc輙SrfV޽8 =!i >ehP4#[6<4΅(_\a⩇PXa>*l:uMs;iZiz5 J@NspR6zgQI'%#~hDqr`OmDwIY߾J"ǵa>W')kW}\`.e›Y&D'EH,\fQp7ll_jyNFO*nwI92sMTP$Ә'HIEdgX$` SVĵƩZc鏳î.ݨឃދ!0LJc׿:+gee:]GW{r,kn ZBHo@;˝ (~Dm/9`ӊb6I/Uu4&4{/pR8h&rJ[ 8Mie" 8W <~@3HiΊ]K|q( $Z'7;` dL)=}ކX3o^ED#N E8+R1vqd>(W [ !6""`JR09)9)9`mCS4Yc0 vLY_]%:b0)q@ނyHXN"?P Ujc4{r^k{H8MH<ǒ*!˻H[Kۄ5g; a44b ̝[ ԖrIs+oC*P6 gE^Gpe`!ުt[cDV8"Z蜗4ZkDKC@NF6V^6TB F89,"agEHH,3,Ҹ)ܰHt 4ZS aNPF JUu[3˟I^Z.ztj$$HHg)L*G"$3dNfn36nngm^Eep1?/DQ0S0~ aqR0:mϟ.JUIvUQ~ʭ;dJ*l Y/3&hKCD(8 &y-Q;APk ?rR*[ `vE6۝ggzmfzPd@D%Wƀ@ 
ckNJΤQXMV+JyuՉ'Lvw/Lv3d7LvGvByxXq\O]# Ô*lH (ʬ6A(RPhzZ.Um?] xn%m" x=mI`.:1 5SL0Az's p8sW Lq3%+Ή ˙f)nK#:=lZH.pVbQܭ6Jh 춡yO]*TP"{7sj%SM8jP !  -:FQJzzCB0dֳKޯd֓YOf=,T7{W| z!=Y\ ]"$SOtF5 D+Iaϖ4z%SCՋ˟q)3?\*sЮ |U#8 R$'LjDLl+Q# ".L btI7EKTYT1n*$N￸4< Ky(A%(aGA°g4Sځ#&s=J7u RI80j{FqE׿j 1 7 H("_H))5O34V( ߇ps}gz6 UꓛǹR|}>ls܇O'RD{"=M\,]\JWB)G\<;:|9+4Y!&ڵ^w|M;h-ހDxNbaTX8SP:rRbв)/Z*g<)Cg8[xڭdoo[Y[Y[YN/|L>{3~W&4|BL.+)pDQ;7ZN ix]+\ 8Rx?f=GΞ9w0\EꣵkWTZ"#; ژ cг1Vn$h߸VnV~?hڏ.koK.kwOW Dڪa$zt T>׏kv ̗kǷ;4oSHkM蝤ஆ &_9 ڀCZth?KVnJĤk$*B,HHV]q*h&N]  9xxT{P⒝6=.T] ؍ّQ[ͺ7;4@lAgZxrwsjboL_\G,:JK: gPZ< O2fIHYgk@Vm5Vm5V}8 tqhVI1-%Q1"KU9UemGAc7vBܫF/`۵{?uy= 1֥W ZPtT1T0bD5$tCUC&)Ky@v3-C.39yTu;o8?I'{doǢ@~ӣO~{%_oÿ]|%|Pϥq|!}ӹ?9G|_#;Dks6<S^ {!7ly)ߞҍ? r}%nTY ܯnKw=BIm7~ʐ{Mn5e}3oڰZ,߭B>q[wJt;BDP}W)!}˵;fxIvow|Hk/6s } >7 0%*O{ˣEx#gw-Y:NZ1tk˿ξK[%foyz>+%g>j~eSy|GV3.J9,ɯW[-m [_NN: 2ڧRsŨbfê\ f,I D 5WͅGS]Y| K?A{3mfp&D|эZ3gˍλAZ3Gm:j ~gms@GT*P-B:W&J#hLBZ24S4 S:Э ǟLcCɀ`v0~;g`[ wmC.X #*ZJP*l1S9i\1'pO|4O|  ' Z|ˍَA0OމcsD]}<ع(;?|k*u@0Gy̦l1N>8A|΃\ cˍgL!ee%rlOCv\&L *1pʧ5N5/ey^)&Wv>WkaYᔹ[tT-DH d$(',C`V{jcTPU 4CuSF̻~Č7*/S5m 9kizRIqh>8=9>H &5`^&Dxw킘t&x!u!ֶMhScb[lY{CD-Ddb1jЬI;*^RN)g:0{LJHgs )10$.P4kqh;g]M$*g6Pb*MeQ REېsLH qJkGxSOQA30!4"u,\*qQظgԤX9y-(!9 4Y1[~sFKt];>8gOVu;=YZWw)u&[8)RJ.WLBY^CJ!yFPtrUGޅIZ&xcOԬ%=yqeJ$>vռ9CYw,Cϩ粮 QᄔS`)ПH$"Gr.Pz, ah<: Wր<;lS_v kg7Zk-SEܔ noxP׻͠r>XҮidK.)^3,ͮ%ڵ;ȸT}PW~9j9c`;l'+8ufuXJJ5}]NI,EeeMkw3zkQivH%Xb,M{cP)$o5//NǢںr('†BJ !+aKQ)3HW6-$Z"i{PiU 14^/*񨼲n_(HJVm(O߄ bRF9&% 2B&u,sln綸%[KUc0IV@BlC 8ϣ1yeݾB=2MG` Rcr(gFrx+T8bU+jRk~ZpByyCW̅svf,mG'?֮eJ8_qS?Ѳj(#HX%JfgхjB퐷uP=ZK0\c^Q=e|BF%`$&Q 4W{ɨ1Zi.+n~Źymi'{=y[hCEHݸSJ):KYA1j}UwBYK$16B+¶Hj:ٖ҃-mL,A!$.eptqW&__;m+}ySs$lˎB&GB6B^$ո>5{aőp Y}$dB@Mѐ GD6B!1_9)ˢrvBX(G}5YgG7Zx\ÏkZKShA/C*m2* &W}V*JQJùR#M$'UVC- zUG\`VEò:ϛ) C1B  E9Zߢ̝#g|no=tCzSm̎HVg3J5I,@%dfј .B\oV9^q :cyOty ]뾢o*GZySs|VbV>&+Ҹ5'kVce{#bL :kZ1=-CqaU/[d}7yJ-Iս8 DHLi Iquv5?HA)ȑHJu[QbX*z<ȫ~r7Qmo O'jXSKTE8+Tp%Smɕnu?ᴆw $Gzy_hir|.smHpM4mc{rTX#,s04:d艃iW*iS|߯`A\-Z 4WhSب NUG]rz_Z !hOoUt9JJSDڊdtNt1(;" /*q@mIYQY qbe&jxș[]ds"%P 
@HN%D09)Ք4i0~Od: #98:\i _5-|-8]ww)tc ~tnBpXKk3[NUp;ih:(UJ %Ǻr`EK7C{_R@Xf.gj*N 9>6$"FXt~2}ĒM x!z=uN6yy*Q.inƂ6(?gK4$ʻӋޠ'3CX]2Y7O. ڃҞϞW\eBVNoe\l^lf?l^ i(T͛O^26jW4˒6?֥\5.:,C ;#ylmOO ?EOm;UZn}lA(JV'[ ln/7mNry RT:-^edt2z~Ys! h+yHN~et| R弭 0<0Rz>Q;\(-Б3m]21[RSO/t;KO7uci`/_NjC_mg_ٗvHvʎ~1\~ߋȂD%_?pr\nQ|AC љ.ŝ+gK"\x|뛭v!zV';P!㔭SH8s@'a#U 4:Q߸bdL'7?LMYK>2{C}ihFBP - 󂪎.!DvPlPI!L~v!$0|CEɩE'JZ|||BQ@j~AJ ʜz@„ZpB0gm]o:2 WF _Ϯ\oY:~D|v닓*21f[F>,"]6K螱u%$j1q̼(w٧I{wjW <,Q˭"ϋ7Vj0qhNC"fjnL4䬥ADJ?*FތNP3᠁J9 >ghm~5jږ$ڒcz!YI6ӧNS3Ui_< Zp;hKcm$S7m'=O) m:*1Rp7A  y[">|JwU(ة|:pvt_"6>9 S_Kq녭I +l @ŕiJ@待/!gP/;YPyTe>mt)kg紣O8sd}hh2PEd qȠ 1yɀ V}/ =9R"ީ(qP{+Se8T,RӒx`]퍅 B@PS@đ',)vL]tiX;xkTw:YeA0-+Q 9pfVQTB. 8@ϢK[ -SB E?{w%Uڢ9Sɝ\ndfB.-*,2K<^NTjRUj" ʏzkKz4,_RO4~UE>FÜS'tr.Ǣܹ)ц0_פJnѸtpHϰ+0+RA7 p>őEs/0h\|J@5Hi @h49,đܱ7(prW͙pTk:=k- T1Nynѐ%D. :Y[o#VE0phZe :WXtO6y2M߿TK ȱX87j2ֆoQ3P#n$@E;_\@X{1!z[/]F0UDe7EfsGuĮ\ݖ#c?ߢYbq FQB'mI[O?; -~ruّcD7~zW]$(4+əCGf0z(R x(i4vӎ=H ꌒ}x eL!5:IH%Zslt{c)~ȭF~ۆj/|P{УT֛u s&Eش;+?.$z4О,Q >F>'v#`5s)ms mF>Պr&Fgj 1Χ-F]fO]oV_XU5\Qt)B%m6cyz 2L`c@ o4t- ZަoyD; t䅡 aݸd,OD^HnΖZ>A䯿\.R3CdToG uQi`;| 3-72[{uV_-VleE#R0u:NPvZ-5ͷCkzsaV%F*ȣ.5ʼn d'?`{lv#`܌ !5 Wg")c 3Z"G\cB 0!#Đ,i:~ߪoz8_!}6Nq1}K‹e{qV)jHRgHĐaOWUU]#p5xƪfPU(R\WK ç\i `Dqri_uKB9.s^Ō4xn3$97>aV$Tk=J4 a8,$!nIY.LmIf,OLIA !ꕽ+Fu銶{C_rW]Jkg#d5*Q{6DZ_Bt!Di"Y1|(@FcqUJbw'YDc(fBH)Ü Y؄\N}dJrTIDqER<[)y+ $؆!.9kKMwI۴o;LeAlDvT32s1runWK1x a:I\vc73|prPP63sݨȷFRQ6^s ] f f1Hd,Hs[D1^np^/]ot92* 3Ù̘Q&1ޮ{[*(qoMSzҊ?ks,fNVQ,36e1BwbRըHmjk{IlZ#,Qt`T7]v~Nt}HWFD-A PK(be Rm"LrƩ-\r7Ttq,L3%Τ #ñjSQ1^KE-=cQWR8c+ yLG#j 7%M cY [I6<{[-@?g[&yc}%6i@N"nÿF];Ne)2C8YoKR7 1hQ{E*B=m~E,Ag! 
14741ms (14:04:30.527)
Jan 22 14:04:30 crc kubenswrapper[4801]: Trace[506423741]: [14.741494217s] [14.741494217s] END
Jan 22 14:04:30 crc kubenswrapper[4801]: I0122 14:04:30.527424 4801 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 22 14:04:30 crc kubenswrapper[4801]: I0122 14:04:30.527835 4801 trace.go:236] Trace[106521572]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 14:04:15.693) (total time: 14834ms):
Jan 22 14:04:30 crc kubenswrapper[4801]: Trace[106521572]: ---"Objects listed" error: 14834ms (14:04:30.527)
Jan 22 14:04:30 crc kubenswrapper[4801]: Trace[106521572]: [14.834414107s] [14.834414107s] END
Jan 22 14:04:30 crc kubenswrapper[4801]: I0122 14:04:30.527863 4801 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 22 14:04:30 crc kubenswrapper[4801]: I0122 14:04:30.529222 4801 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 22 14:04:30 crc kubenswrapper[4801]: I0122 14:04:30.556685 4801 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 22 14:04:30 crc kubenswrapper[4801]: I0122 14:04:30.569494 4801 csr.go:261] certificate signing request csr-8l7q8 is approved, waiting to be issued
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.083087 4801 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.095681 4801 csr.go:257] certificate signing request csr-8l7q8 is issued
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.131417 4801 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49694->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.131509 4801 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48590->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.131524 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49694->192.168.126.11:17697: read: connection reset by peer"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.131574 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48590->192.168.126.11:17697: read: connection reset by peer"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.132697 4801 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.132733 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.155120 4801 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.155179 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.434291 4801 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Jan 22 14:04:31 crc kubenswrapper[4801]: W0122 14:04:31.434518 4801 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Jan 22 14:04:31 crc kubenswrapper[4801]: W0122 14:04:31.434570 4801 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.434509 4801 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd/events\": read tcp 38.129.56.86:36134->38.129.56.86:6443: use of closed network connection" event="&Event{ObjectMeta:{etcd-crc.188d1294137dba88 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 14:04:15.817939592 +0000 UTC m=+4.519839775,LastTimestamp:2026-01-22 14:04:15.817939592 +0000 UTC m=+4.519839775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 22 14:04:31 crc kubenswrapper[4801]: W0122 14:04:31.434596 4801 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.507236 4801 apiserver.go:52] "Watching apiserver"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.509692 4801 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.510186 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.510888 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.510914 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.511065 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.511388 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.511508 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.511910 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.512147 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.512458 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.512532 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.513024 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.513046 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.513896 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.514049 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.514192 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.515759 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.515838 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.516118 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.517527 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.523546 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline
is 2025-12-22 01:02:38.165859621 +0000 UTC Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.524827 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534544 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534576 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534594 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534614 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534632 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534650 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534666 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534685 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534704 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534725 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534741 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.534756 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.535557 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.535590 4801 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.536103 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.536313 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.540976 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.541561 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.545435 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\
\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\
"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9
be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.553640 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.553667 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.553678 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.553724 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:32.053708407 +0000 UTC m=+20.755608590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.555914 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.557739 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.557771 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.557786 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.557852 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:32.057833496 +0000 UTC m=+20.759733679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.557880 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.558887 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.561796 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.573768 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.591219 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.612416 4801 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.613121 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.625782 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.626549 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-b2p9x"]
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.626839 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lstn7"]
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.626980 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b2p9x"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.627497 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gf7j5"]
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.627518 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lstn7"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.627724 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5t2tp"]
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.627756 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gf7j5"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.628183 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nx7sl"]
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.628311 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.628944 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.629015 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.631112 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.631218 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.631268 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.632054 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.632174 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.632806 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.632921 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.632975 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.633547 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634054 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634075 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634208 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634217 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634313 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634356 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634428 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634587 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634688 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634697 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.634796 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635060 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635102 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635129 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635152 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635177 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635201 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635226 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635246 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635265 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635286 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635306 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635328 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635351 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635371 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635384 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635392 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635466 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635497 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635522 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635548 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635573 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635598 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635624 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635648 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635672 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635698 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635722 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635747 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635772 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635796 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635821 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635844 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635868 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635892 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635919 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635947 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.635994 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636016 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636039 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636062 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636086 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636111 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636133 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636155 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636178 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636290 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636314 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636330 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636345 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636363 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636380 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636395 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636411 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636427 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636441 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636476 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636494 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636511 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636269 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636551 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636869 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636884 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636889 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636897 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636941 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636952 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636962 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.636963 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637010 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637039 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637051 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637083 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637301 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637312 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637147 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637158 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637164 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637431 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637578 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637667 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637689 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637716 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637810 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638141 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638145 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638231 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638291 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638554 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638561 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638581 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638576 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638791 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638824 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638778 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.638872 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.639378 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.639804 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.639867 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640008 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640168 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640253 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640386 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640474 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640636 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640702 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641012 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.640997 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641183 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641207 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641472 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641499 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641560 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641664 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.637102 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641836 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641724 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641838 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.641875 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642017 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642039 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 
14:04:31.642056 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642074 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642076 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642093 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642109 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642126 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642141 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642158 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642177 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642193 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642209 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642254 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642272 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642289 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642304 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642320 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642339 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642355 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642370 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642386 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642401 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642419 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642436 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642469 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642486 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642510 
4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642506 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642537 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642546 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642555 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642610 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642637 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642726 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642754 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642781 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642809 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642835 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642857 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642880 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642903 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642928 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642951 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642965 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642970 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.642972 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643227 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643244 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643254 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643289 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643294 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643338 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643360 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643383 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643404 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643424 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643490 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643513 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643536 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643549 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643558 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643581 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643585 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643629 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643689 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643722 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643748 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643770 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643793 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643813 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643835 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643856 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643881 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643903 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643924 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643948 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644024 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644047 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644066 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644087 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644105 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644130 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644150 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644170 4801 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644191 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644211 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644303 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644328 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644349 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644370 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644397 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644418 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644440 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644479 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 22 
14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644501 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644520 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644540 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644562 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644584 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644605 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644626 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644648 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.646717 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649566 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649608 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649631 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649656 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649685 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649709 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649736 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649764 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649790 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649815 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649842 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649867 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649895 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649923 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649949 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649974 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650000 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650034 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650058 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650082 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650106 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650176 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650203 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650229 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650256 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650281 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650307 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650332 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650359 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650386 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650411 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650437 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650846 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650876 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650901 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650925 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651018 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651100 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651139 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651223 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651358 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651378 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651394 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651407 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651421 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651435 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651465 4801 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651478 4801 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651492 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651506 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651522 4801 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651536 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651549 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651562 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651575 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651588 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651601 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651615 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651629 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651643 4801 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651655 4801 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651668 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651681 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651694 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651708 4801 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651721 4801 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651734 4801 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651747 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651762 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651776 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651791 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651804 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651817 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651830 4801 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651843 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651859 4801 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651871 4801 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651884 4801 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651896 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651909 4801 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651922 4801 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651934 4801 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651947 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651991 4801 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652005 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652018 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652031 4801 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652048 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652063 4801 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652077 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652090 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652103 4801 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652116 4801 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652129 4801 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652142 4801 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652155 4801 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652166 4801 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652178 4801 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652190 4801 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652202 4801 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652215 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652227 4801 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.643686 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.656248 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644100 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644187 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644404 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644548 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644692 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.644943 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.645191 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.645209 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.645506 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.645554 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.645793 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.646345 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.646754 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.646938 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.647269 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.647365 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.647741 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.647761 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.647831 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.648565 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.648960 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649190 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649894 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.649293 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650146 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650414 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650519 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650588 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.650948 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651067 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.651324 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.653417 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.654334 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.654662 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.656055 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.656185 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.656209 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.652307 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.656739 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.656756 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.657235 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.657322 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.657650 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.657977 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.658315 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.658423 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.658438 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.658670 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.658898 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.659180 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.659553 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.659809 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.659856 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.659177 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.659918 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.659984 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.660087 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.660163 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.660295 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.660520 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.660747 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.661776 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.661828 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.661930 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.662021 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.662882 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.662921 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664634 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664659 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664698 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664729 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664755 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664783 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664803 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.664997 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.665129 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.666931 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667313 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667408 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667672 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667663 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667730 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667786 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667996 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.667993 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668098 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668221 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668282 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668352 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668526 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668568 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668658 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668666 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668677 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.668808 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.669044 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.669441 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.669515 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.669542 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.669595 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.669844 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.670412 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.670743 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.670843 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:04:32.170825073 +0000 UTC m=+20.872725256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.671064 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.671151 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.671225 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.671276 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.671466 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.671679 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.671804 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.672014 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.672213 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.672653 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.673358 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.673768 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.674429 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.674866 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.674868 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.675300 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.675325 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.675392 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:32.175372384 +0000 UTC m=+20.877272647 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.675768 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.675770 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.676030 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.676094 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: E0122 14:04:31.676133 4801 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:32.176122876 +0000 UTC m=+20.878023109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.680913 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.681424 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.689260 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.691399 4801 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367" exitCode=255 Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.692476 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367"} Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.724389 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.724385 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.724619 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.724885 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.725542 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.725558 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.732531 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.737099 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.740241 4801 scope.go:117] "RemoveContainer" containerID="244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.740868 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.741036 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.747543 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.752954 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-env-overrides\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.752990 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753006 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-etc-kubernetes\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753021 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-cnibin\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753036 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82373306-6578-4229-851f-1d80cdabf2d7-cni-binary-copy\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753051 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-netns\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753065 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-kubelet\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753083 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/2b842046-5300-4281-9d73-3ae42f0d56da-proxy-tls\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753096 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b842046-5300-4281-9d73-3ae42f0d56da-mcd-auth-proxy-config\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753111 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-node-log\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753125 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753145 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvwm2\" (UniqueName: \"kubernetes.io/projected/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-kube-api-access-cvwm2\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753160 
4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54ncw\" (UniqueName: \"kubernetes.io/projected/fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0-kube-api-access-54ncw\") pod \"node-resolver-gf7j5\" (UID: \"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\") " pod="openshift-dns/node-resolver-gf7j5" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753173 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-systemd-units\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753187 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-var-lib-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753210 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzw75\" (UniqueName: \"kubernetes.io/projected/ea1b189f-1574-4157-ac9f-03282964c451-kube-api-access-jzw75\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753243 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-slash\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" 
Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753264 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-hostroot\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753283 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntz2k\" (UniqueName: \"kubernetes.io/projected/2b842046-5300-4281-9d73-3ae42f0d56da-kube-api-access-ntz2k\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753305 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-script-lib\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753320 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-os-release\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753335 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0-hosts-file\") pod \"node-resolver-gf7j5\" (UID: \"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\") " 
pod="openshift-dns/node-resolver-gf7j5" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753350 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/82373306-6578-4229-851f-1d80cdabf2d7-multus-daemon-config\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753365 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-netns\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753380 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-systemd\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753415 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea1b189f-1574-4157-ac9f-03282964c451-cni-binary-copy\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753431 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-conf-dir\") pod \"multus-b2p9x\" (UID: 
\"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753465 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-multus-certs\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753485 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-bin\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753501 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-config\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753514 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovn-node-metrics-cert\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753529 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-kubelet\") pod \"ovnkube-node-nx7sl\" (UID: 
\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753543 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753557 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-log-socket\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753571 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-system-cni-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753586 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-ovn-kubernetes\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753602 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea1b189f-1574-4157-ac9f-03282964c451-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753616 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-cni-multus\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753643 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-cni-bin\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753656 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-etc-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753671 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-cni-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753688 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-cnibin\") pod 
\"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753703 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-socket-dir-parent\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753718 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-k8s-cni-cncf-io\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753733 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-netd\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753748 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2b842046-5300-4281-9d73-3ae42f0d56da-rootfs\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753773 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2zn\" (UniqueName: 
\"kubernetes.io/projected/82373306-6578-4229-851f-1d80cdabf2d7-kube-api-access-vh2zn\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753790 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-ovn\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753805 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-system-cni-dir\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753819 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-os-release\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753854 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753864 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753874 4801 
reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753883 4801 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753892 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753911 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753921 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753930 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753939 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753948 4801 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753956 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753965 4801 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753973 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753982 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753990 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.753998 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754007 4801 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" 
DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754015 4801 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754024 4801 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754033 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754042 4801 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754051 4801 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754060 4801 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754070 4801 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754080 4801 
reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754089 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754098 4801 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754106 4801 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754114 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754122 4801 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754132 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754141 4801 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754150 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754158 4801 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754167 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754176 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754185 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754193 4801 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754202 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754210 4801 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754241 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754253 4801 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754263 4801 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754274 4801 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754285 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754295 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754304 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754313 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754330 4801 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754339 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754349 4801 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754358 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754367 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754375 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754384 4801 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754393 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754401 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754409 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754418 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754426 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 22 
14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754434 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754443 4801 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754467 4801 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754475 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754483 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754491 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754500 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754507 4801 
reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754516 4801 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754524 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754531 4801 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754540 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754549 4801 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754558 4801 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754566 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754575 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754588 4801 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754597 4801 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754605 4801 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754613 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754621 4801 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754629 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc 
kubenswrapper[4801]: I0122 14:04:31.754638 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754646 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754655 4801 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754664 4801 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754672 4801 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754687 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754696 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754704 4801 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754712 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754720 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754728 4801 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754737 4801 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754745 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754753 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754761 4801 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on 
node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754770 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754778 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754786 4801 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754794 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754804 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754812 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754822 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754831 4801 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754840 4801 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754848 4801 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754856 4801 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754868 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754881 4801 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754893 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754905 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754914 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754923 4801 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754932 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754940 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754951 4801 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754959 4801 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754968 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754977 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754986 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.754995 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755004 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755013 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755022 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755031 4801 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755040 4801 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755050 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755059 4801 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755068 4801 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.755077 4801 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.761299 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.770788 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.788869 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.802479 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.818800 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.821540 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.832252 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.835009 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.842861 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856196 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-env-overrides\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856226 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856247 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-etc-kubernetes\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856263 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82373306-6578-4229-851f-1d80cdabf2d7-cni-binary-copy\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856278 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-netns\") pod \"multus-b2p9x\" (UID: 
\"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856292 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-kubelet\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856309 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-cnibin\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856325 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b842046-5300-4281-9d73-3ae42f0d56da-proxy-tls\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856339 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b842046-5300-4281-9d73-3ae42f0d56da-mcd-auth-proxy-config\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856360 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54ncw\" (UniqueName: \"kubernetes.io/projected/fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0-kube-api-access-54ncw\") pod \"node-resolver-gf7j5\" (UID: 
\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\") " pod="openshift-dns/node-resolver-gf7j5" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856376 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-systemd-units\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856389 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-var-lib-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856404 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-node-log\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856418 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856434 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvwm2\" (UniqueName: \"kubernetes.io/projected/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-kube-api-access-cvwm2\") pod \"ovnkube-node-nx7sl\" (UID: 
\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856462 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzw75\" (UniqueName: \"kubernetes.io/projected/ea1b189f-1574-4157-ac9f-03282964c451-kube-api-access-jzw75\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856505 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-slash\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856520 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-hostroot\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856535 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntz2k\" (UniqueName: \"kubernetes.io/projected/2b842046-5300-4281-9d73-3ae42f0d56da-kube-api-access-ntz2k\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856549 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-os-release\") pod \"multus-additional-cni-plugins-lstn7\" (UID: 
\"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856564 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-script-lib\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856580 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0-hosts-file\") pod \"node-resolver-gf7j5\" (UID: \"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\") " pod="openshift-dns/node-resolver-gf7j5" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856597 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/82373306-6578-4229-851f-1d80cdabf2d7-multus-daemon-config\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856613 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-systemd\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856627 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea1b189f-1574-4157-ac9f-03282964c451-cni-binary-copy\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " 
pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856644 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-conf-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856667 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-multus-certs\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856683 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-netns\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856699 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-kubelet\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856714 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 
14:04:31.856729 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-log-socket\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856745 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-bin\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856741 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-node-log\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856760 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-config\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856833 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovn-node-metrics-cert\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856863 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-system-cni-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856885 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-ovn-kubernetes\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856910 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea1b189f-1574-4157-ac9f-03282964c451-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856936 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-cni-multus\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856962 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-cni-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.856983 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-cnibin\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857006 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-socket-dir-parent\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857030 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-k8s-cni-cncf-io\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857053 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-cni-bin\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857077 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-etc-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857101 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-netd\") pod \"ovnkube-node-nx7sl\" (UID: 
\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857123 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2b842046-5300-4281-9d73-3ae42f0d56da-rootfs\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857171 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2zn\" (UniqueName: \"kubernetes.io/projected/82373306-6578-4229-851f-1d80cdabf2d7-kube-api-access-vh2zn\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857194 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-ovn\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857217 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-system-cni-dir\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857239 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-os-release\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " 
pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857289 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857340 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-config\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857397 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857521 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-os-release\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857739 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-slash\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857767 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-hostroot\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.857900 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-os-release\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.858070 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-env-overrides\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.858333 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-script-lib\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.858389 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0-hosts-file\") pod \"node-resolver-gf7j5\" (UID: \"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\") " pod="openshift-dns/node-resolver-gf7j5" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.858474 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lstn7\" (UID: 
\"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.858527 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-etc-kubernetes\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.858991 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/82373306-6578-4229-851f-1d80cdabf2d7-multus-daemon-config\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859032 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-systemd\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859432 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82373306-6578-4229-851f-1d80cdabf2d7-cni-binary-copy\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859634 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-kubelet\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859676 4801 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-conf-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859696 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-multus-certs\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859702 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-cnibin\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859716 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-netns\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859751 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-system-cni-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.859772 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-ovn-kubernetes\") 
pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.860878 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-cni-multus\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.860954 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-k8s-cni-cncf-io\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.860970 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-ovn\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.860988 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-kubelet\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861006 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-run-netns\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc 
kubenswrapper[4801]: I0122 14:04:31.861009 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-bin\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861021 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-netd\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861042 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-log-socket\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861050 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-cni-dir\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861057 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-multus-socket-dir-parent\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861067 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861078 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82373306-6578-4229-851f-1d80cdabf2d7-host-var-lib-cni-bin\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861085 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2b842046-5300-4281-9d73-3ae42f0d56da-rootfs\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861090 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-etc-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861097 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-var-lib-openvswitch\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861131 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-cnibin\") pod 
\"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861134 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-systemd-units\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861164 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea1b189f-1574-4157-ac9f-03282964c451-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861313 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea1b189f-1574-4157-ac9f-03282964c451-system-cni-dir\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861432 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea1b189f-1574-4157-ac9f-03282964c451-cni-binary-copy\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.861836 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/2b842046-5300-4281-9d73-3ae42f0d56da-mcd-auth-proxy-config\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.866620 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b842046-5300-4281-9d73-3ae42f0d56da-proxy-tls\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.876075 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovn-node-metrics-cert\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.864108 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.880217 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzw75\" (UniqueName: \"kubernetes.io/projected/ea1b189f-1574-4157-ac9f-03282964c451-kube-api-access-jzw75\") pod \"multus-additional-cni-plugins-lstn7\" (UID: \"ea1b189f-1574-4157-ac9f-03282964c451\") " pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.883104 
4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvwm2\" (UniqueName: \"kubernetes.io/projected/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-kube-api-access-cvwm2\") pod \"ovnkube-node-nx7sl\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.883137 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54ncw\" (UniqueName: \"kubernetes.io/projected/fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0-kube-api-access-54ncw\") pod \"node-resolver-gf7j5\" (UID: \"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\") " pod="openshift-dns/node-resolver-gf7j5" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.884171 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntz2k\" (UniqueName: \"kubernetes.io/projected/2b842046-5300-4281-9d73-3ae42f0d56da-kube-api-access-ntz2k\") pod \"machine-config-daemon-5t2tp\" (UID: \"2b842046-5300-4281-9d73-3ae42f0d56da\") " pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.889432 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2zn\" (UniqueName: \"kubernetes.io/projected/82373306-6578-4229-851f-1d80cdabf2d7-kube-api-access-vh2zn\") pod \"multus-b2p9x\" (UID: \"82373306-6578-4229-851f-1d80cdabf2d7\") " pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.891711 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.904868 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.916049 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.935338 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.945021 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b2p9x" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.958002 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: W0122 14:04:31.961504 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82373306_6578_4229_851f_1d80cdabf2d7.slice/crio-931c966415d6393d0525590b1bc8ae82110283723aeaa61c91269ee769b14ac9 WatchSource:0}: Error finding container 931c966415d6393d0525590b1bc8ae82110283723aeaa61c91269ee769b14ac9: Status 404 returned error can't find the container with id 931c966415d6393d0525590b1bc8ae82110283723aeaa61c91269ee769b14ac9 Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.975926 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:31 crc kubenswrapper[4801]: I0122 14:04:31.994840 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.010124 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.028421 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lstn7" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.033645 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: W0122 14:04:32.046405 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea1b189f_1574_4157_ac9f_03282964c451.slice/crio-b32d7693987266c60e6b51bb2494e635e61bfb5205cedff6a26799bae43f8a8a WatchSource:0}: Error finding container b32d7693987266c60e6b51bb2494e635e61bfb5205cedff6a26799bae43f8a8a: Status 404 returned error can't find the container with id b32d7693987266c60e6b51bb2494e635e61bfb5205cedff6a26799bae43f8a8a Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.049125 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gf7j5" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.058682 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.058748 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.058901 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.058929 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.058942 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.058999 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:33.058983873 +0000 UTC m=+21.760884056 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.059261 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.059387 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.059491 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.059644 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:33.059621551 +0000 UTC m=+21.761521754 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.059363 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.075075 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.079542 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.086890 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.097547 4801 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-22 13:59:31 +0000 UTC, rotation deadline is 2026-10-11 03:51:26.381402167 +0000 UTC Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.097601 4801 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6277h46m54.283803291s for next certificate rotation Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.104236 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.142723 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.169777 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.188160 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.210069 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.241674 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.260776 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.260862 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.260900 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.260972 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.261016 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:33.261004187 +0000 UTC m=+21.962904370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.261072 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:04:33.261065048 +0000 UTC m=+21.962965231 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.261125 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: E0122 14:04:32.261144 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:33.261139501 +0000 UTC m=+21.963039674 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.523775 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 07:32:10.328937975 +0000 UTC Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.696323 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.696687 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.696698 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d1e8c73ea69a6ac57dcb1bb35c6caec0d5dc06856e8abba33feec13bbafb4da7"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.697577 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 
14:04:32.697618 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8d4aea3e26ac28a1b952c6ff1a1c333af01b06f1e1049c91d25803eb07f40738"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.698926 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerStarted","Data":"da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.698955 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerStarted","Data":"931c966415d6393d0525590b1bc8ae82110283723aeaa61c91269ee769b14ac9"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.700545 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.702367 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.702721 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.703956 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183"} Jan 22 14:04:32 crc 
kubenswrapper[4801]: I0122 14:04:32.703991 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.704002 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"af9ec021687a2223f0fb640084a8fe4a91ebe7fd07920353e7580c8144b91c7e"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.705392 4801 generic.go:334] "Generic (PLEG): container finished" podID="ea1b189f-1574-4157-ac9f-03282964c451" containerID="90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a" exitCode=0 Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.705486 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerDied","Data":"90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.705510 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerStarted","Data":"b32d7693987266c60e6b51bb2494e635e61bfb5205cedff6a26799bae43f8a8a"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.707132 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gf7j5" event={"ID":"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0","Type":"ContainerStarted","Data":"34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.707155 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-gf7j5" event={"ID":"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0","Type":"ContainerStarted","Data":"1d669207314ffb3d1b78e9fd0b4ca21ce1727033b21bc59fa19a232fb6c00d5c"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.709052 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b" exitCode=0 Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.709085 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.709129 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"e45623a049261ffc64fd8636b4c0333cc8ce5a9ec0c994c617626f6b8a550e20"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.710040 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"83d22170c9161855b469477c9ac4d3f39d63c54297958cff647ccf3cb68c6f05"} Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.717782 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.746818 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.765319 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 
dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.791170 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.808223 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.826229 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.855246 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.871916 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.887511 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.898057 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.913139 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.936577 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.960503 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.971899 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:32 crc kubenswrapper[4801]: I0122 14:04:32.988470 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e5
2e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:32Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.002960 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.018272 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.030279 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.041820 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.060237 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.070348 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.070409 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 
14:04:33.070510 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.070530 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.070540 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.070580 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:35.070567815 +0000 UTC m=+23.772467998 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.070510 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.070602 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.070608 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.070630 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:35.070624796 +0000 UTC m=+23.772524979 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.089057 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d44201710
06143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.107806 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.124143 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.144505 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.158245 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.168928 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.273120 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.273255 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:04:35.273235517 +0000 UTC m=+23.975135700 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.273283 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.273326 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.273399 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.273413 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.273463 4801 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:35.273439523 +0000 UTC m=+23.975339716 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.273489 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:35.273477084 +0000 UTC m=+23.975377267 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.524793 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 12:19:45.882725068 +0000 UTC Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.525105 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jnfrc"] Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.525435 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.527522 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.527641 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.527770 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.528959 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.542729 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.555891 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.569417 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.571403 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.571533 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.571615 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.571664 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.571685 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:33 crc kubenswrapper[4801]: E0122 14:04:33.571767 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.575074 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.575966 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.576788 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmcv9\" (UniqueName: \"kubernetes.io/projected/31dbc047-7646-4c7b-a9e8-3b8949ea027d-kube-api-access-kmcv9\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.576834 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31dbc047-7646-4c7b-a9e8-3b8949ea027d-host\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.576895 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/31dbc047-7646-4c7b-a9e8-3b8949ea027d-serviceca\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.577274 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.577942 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.579411 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.580011 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.580724 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.581658 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.582267 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.583193 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.583718 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.584747 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.585251 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.585757 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.586811 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.587325 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.588240 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.588863 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.589600 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.590319 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.590562 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.591328 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.592906 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.593372 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.594383 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.594840 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.595646 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.596817 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.597259 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.598927 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.599789 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.600688 4801 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.600815 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.602385 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.603594 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.604077 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.606196 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.606874 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.607291 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.607970 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.608645 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.609946 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.610522 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.611620 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.612311 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.614575 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.615080 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.616184 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.616773 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.617901 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.618465 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.620015 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.620643 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.621671 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.622242 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.622790 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.625686 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.642218 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.656938 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.673819 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.677846 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/31dbc047-7646-4c7b-a9e8-3b8949ea027d-serviceca\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.677891 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmcv9\" (UniqueName: \"kubernetes.io/projected/31dbc047-7646-4c7b-a9e8-3b8949ea027d-kube-api-access-kmcv9\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.677915 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31dbc047-7646-4c7b-a9e8-3b8949ea027d-host\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.677962 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31dbc047-7646-4c7b-a9e8-3b8949ea027d-host\") pod \"node-ca-jnfrc\" (UID: 
\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.679574 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/31dbc047-7646-4c7b-a9e8-3b8949ea027d-serviceca\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.684486 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\"
,\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.694625 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.702481 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmcv9\" (UniqueName: \"kubernetes.io/projected/31dbc047-7646-4c7b-a9e8-3b8949ea027d-kube-api-access-kmcv9\") pod \"node-ca-jnfrc\" (UID: \"31dbc047-7646-4c7b-a9e8-3b8949ea027d\") " pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.713062 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.716311 4801 generic.go:334] "Generic (PLEG): container finished" podID="ea1b189f-1574-4157-ac9f-03282964c451" containerID="c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd" exitCode=0 Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.716347 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerDied","Data":"c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd"} Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.725812 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.746603 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 
14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.759822 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.771294 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.790554 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.803634 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.815635 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.831355 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 
14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.845581 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.861016 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.876617 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jnfrc" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.880760 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.893524 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.904311 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.916092 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.925327 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: I0122 14:04:33.959141 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:33Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:33 crc kubenswrapper[4801]: W0122 14:04:33.979571 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31dbc047_7646_4c7b_a9e8_3b8949ea027d.slice/crio-5bf0a934327324569d880218a40d8a2ceb4f6d718c2d0297e684cd8ce42723be WatchSource:0}: Error finding container 5bf0a934327324569d880218a40d8a2ceb4f6d718c2d0297e684cd8ce42723be: Status 404 returned error can't find the container with id 5bf0a934327324569d880218a40d8a2ceb4f6d718c2d0297e684cd8ce42723be Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.342336 4801 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.344362 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.344397 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.344407 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.344524 4801 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.351041 4801 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.351383 4801 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.352707 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 
14:04:34.352744 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.352756 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.352774 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.352790 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: E0122 14:04:34.377702 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.381553 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.381584 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.381592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.381607 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.381618 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: E0122 14:04:34.394943 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.397920 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.397957 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.397968 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.397984 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.397996 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: E0122 14:04:34.408947 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.411864 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.411910 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.411921 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.411937 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.411951 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: E0122 14:04:34.426993 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.432068 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.432100 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.432110 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.432125 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.432134 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: E0122 14:04:34.443143 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: E0122 14:04:34.443274 4801 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.444608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.444632 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.444640 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.444654 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.444666 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.525439 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 01:41:58.526883423 +0000 UTC Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.547646 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.547691 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.547698 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.547714 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.547726 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.650943 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.650982 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.650993 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.651011 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.651025 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.724156 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jnfrc" event={"ID":"31dbc047-7646-4c7b-a9e8-3b8949ea027d","Type":"ContainerStarted","Data":"1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.724207 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jnfrc" event={"ID":"31dbc047-7646-4c7b-a9e8-3b8949ea027d","Type":"ContainerStarted","Data":"5bf0a934327324569d880218a40d8a2ceb4f6d718c2d0297e684cd8ce42723be"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.727391 4801 generic.go:334] "Generic (PLEG): container finished" podID="ea1b189f-1574-4157-ac9f-03282964c451" containerID="c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055" exitCode=0 Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.727473 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerDied","Data":"c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.733419 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.733791 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.733812 4801 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.733833 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.733849 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.733864 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.753228 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.754509 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.754556 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.754567 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.754592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.754602 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.765572 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z 
is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.781553 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.795503 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 
14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.807277 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.818537 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.835508 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.845157 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.856460 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22
T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.858177 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.858210 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.858219 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.858234 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.858245 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.869017 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.879040 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.886003 4801 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.892279 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.905600 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.917771 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.928885 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.941289 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.949858 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.960617 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.960672 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.960684 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.960703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.960716 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:34Z","lastTransitionTime":"2026-01-22T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.963323 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.977252 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:34 crc kubenswrapper[4801]: I0122 14:04:34.996648 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:34Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.016646 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.031641 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.049436 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b
4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.059643 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.063726 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.063754 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.063764 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.063779 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.063787 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.073285 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.090645 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.090923 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.090944 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.090956 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.090989 4801 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:39.090977787 +0000 UTC m=+27.792877970 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.090832 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.091149 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.091280 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.091348 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.091401 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.091505 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:39.091492072 +0000 UTC m=+27.793392255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.123108 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.152513 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.166071 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.166112 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.166125 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.166140 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.166151 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.268482 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.268538 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.268561 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.268580 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.268594 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.293675 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.293806 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-22 14:04:39.293785864 +0000 UTC m=+27.995686057 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.294180 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.294406 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.294241 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.294894 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-22 14:04:39.294870245 +0000 UTC m=+27.996770468 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.294511 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.295200 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:39.295180744 +0000 UTC m=+27.997080967 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.372213 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.372262 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.372276 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.372297 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.372310 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.474911 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.474964 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.474977 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.474998 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.475012 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.525975 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 04:50:53.010905627 +0000 UTC Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.570797 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.570798 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.571271 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.570808 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.571352 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:35 crc kubenswrapper[4801]: E0122 14:04:35.571290 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.577431 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.577720 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.577829 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.577931 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.578060 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.680927 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.681179 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.681438 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.681529 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.681604 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.746511 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.750424 4801 generic.go:334] "Generic (PLEG): container finished" podID="ea1b189f-1574-4157-ac9f-03282964c451" containerID="9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49" exitCode=0 Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.750769 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerDied","Data":"9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.770370 4801 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.770699 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.785225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.785283 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.785300 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.785336 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.785355 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.790895 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.805639 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.822601 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.845647 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.859456 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.888845 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.888872 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.888879 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.888893 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.888902 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.891620 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.907297 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.922692 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 
14:04:35.934773 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.948832 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.962356 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.975676 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.991493 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.991532 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.991574 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.991592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.991603 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:35Z","lastTransitionTime":"2026-01-22T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:04:35 crc kubenswrapper[4801]: I0122 14:04:35.992822 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:35Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.013642 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.029659 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.048200 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.067864 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.082903 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.094807 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.094876 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.094891 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.094933 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.094953 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.098869 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.119151 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.131554 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.145567 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22
T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.158664 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.170821 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.196893 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.197218 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.197342 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.197473 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.197572 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.197773 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.239414 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.276575 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.300225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.300398 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.300493 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.300574 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.300636 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.403270 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.403678 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.403881 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.404034 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.404171 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.507482 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.507555 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.507567 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.507585 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.507600 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.526277 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 01:23:10.069621862 +0000 UTC Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.609721 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.609771 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.609787 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.609808 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.609823 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.711583 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.711611 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.711619 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.711632 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.711640 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.755947 4801 generic.go:334] "Generic (PLEG): container finished" podID="ea1b189f-1574-4157-ac9f-03282964c451" containerID="69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae" exitCode=0 Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.756006 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerDied","Data":"69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.763371 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.780201 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.795947 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.813652 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cce
eeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.814064 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.814089 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.814098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.814111 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.814120 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.834110 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.850430 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.864741 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.882716 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.892254 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.904832 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.916944 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.917047 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.917065 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.917083 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.917098 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:36Z","lastTransitionTime":"2026-01-22T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.917811 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.928149 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.940947 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.953516 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:36 crc kubenswrapper[4801]: I0122 14:04:36.967191 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.020599 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.020654 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.020668 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.020690 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.020702 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.122431 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.123052 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.123142 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.123228 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.123300 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.235214 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.235400 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.235503 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.235586 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.235644 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.340739 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.341019 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.341172 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.341300 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.341373 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.443220 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.443259 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.443269 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.443283 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.443294 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.526694 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 17:43:29.188670181 +0000 UTC Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.545491 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.545551 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.545569 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.545591 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.545607 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.571819 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:37 crc kubenswrapper[4801]: E0122 14:04:37.572031 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.572134 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:37 crc kubenswrapper[4801]: E0122 14:04:37.573106 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.573177 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:37 crc kubenswrapper[4801]: E0122 14:04:37.573311 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.647611 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.647655 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.647663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.647689 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.647704 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.750057 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.750105 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.750117 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.750137 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.750148 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.768560 4801 generic.go:334] "Generic (PLEG): container finished" podID="ea1b189f-1574-4157-ac9f-03282964c451" containerID="57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd" exitCode=0 Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.768608 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerDied","Data":"57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.781013 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.794852 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e5
2e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.808298 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.823062 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.834776 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.847317 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.851858 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.851916 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.851926 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.851939 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.851948 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.863258 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.889257 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.906283 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.921330 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.942899 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.954490 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.954826 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.954900 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.954963 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.954983 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:37Z","lastTransitionTime":"2026-01-22T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.955521 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.969668 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:37 crc kubenswrapper[4801]: I0122 14:04:37.985294 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:37Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.058533 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.058587 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.058601 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.058621 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.058632 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.160667 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.160694 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.160703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.160740 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.160750 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.263010 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.263054 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.263065 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.263079 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.263089 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.278988 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.283477 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.292431 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.296943 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.314839 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.334981 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.350596 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.365367 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc 
kubenswrapper[4801]: I0122 14:04:38.365408 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.365420 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.365435 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.365458 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.370193 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b
4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.383966 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.397810 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.412605 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.426317 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.444432 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.458227 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 
14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.467971 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.468008 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.468040 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.468055 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.468070 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.471417 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.481540 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.492521 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.505102 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.519462 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.528467 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 22:30:08.202557353 +0000 UTC Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.535068 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.558976 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.572391 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.572543 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.572577 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.572587 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 
14:04:38.572609 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.572629 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.592732 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.608144 4801 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.621914 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.634736 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.648242 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.661675 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.675441 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.675529 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.675540 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.675561 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.675574 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.685866 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.699954 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.715249 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.730354 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.776354 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" event={"ID":"ea1b189f-1574-4157-ac9f-03282964c451","Type":"ContainerStarted","Data":"9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.779637 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.779681 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.779692 4801 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.779707 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.779720 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.792622 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.805309 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.822526 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.840598 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.
168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b
54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.853580 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.870556 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.882298 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc 
kubenswrapper[4801]: I0122 14:04:38.882339 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.882353 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.882370 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.882385 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.883672 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.904283 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.937888 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.981276 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 
14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:38Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.985157 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.985207 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.985228 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.985250 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:38 crc kubenswrapper[4801]: I0122 14:04:38.985267 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:38Z","lastTransitionTime":"2026-01-22T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.020814 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.059946 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.087966 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.088010 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.088018 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.088033 4801 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.088042 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.096827 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.133754 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.133847 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134019 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134019 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134050 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134077 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134084 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134098 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134169 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:47.134147211 +0000 UTC m=+35.836047434 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.134197 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-22 14:04:47.134184192 +0000 UTC m=+35.836084415 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.143752 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.184411 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.190518 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.190619 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.190640 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.190696 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.190716 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.293835 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.293895 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.293905 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.293920 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.293931 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.335252 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.335385 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:04:47.335361382 +0000 UTC m=+36.037261575 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.335440 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.335537 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.335580 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.335616 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.335625 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:47.335613739 +0000 UTC m=+36.037513912 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.335649 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:04:47.33563918 +0000 UTC m=+36.037539363 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.397381 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.397427 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.397440 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.397473 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.397486 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.500769 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.500822 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.500835 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.500854 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.500871 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.529271 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 22:20:39.845391461 +0000 UTC Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.570964 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.571035 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.571096 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.571180 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.571324 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:39 crc kubenswrapper[4801]: E0122 14:04:39.571465 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.603748 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.603811 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.603827 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.603851 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.603873 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.707446 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.707555 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.707579 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.707602 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.707620 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.787716 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.788304 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.808937 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.817153 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.817345 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.817424 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.817523 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 
14:04:39.817556 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.830290 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.833735 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.854047 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.871396 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.919233 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.920646 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.920689 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.920703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.920721 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.920734 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:39Z","lastTransitionTime":"2026-01-22T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.938394 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.957177 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.969802 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.979422 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:39 crc kubenswrapper[4801]: I0122 14:04:39.990592 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:39Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.005381 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.023227 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.023514 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.023637 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.023735 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.023868 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.030140 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.042031 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.054436 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.065100 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.076984 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.088781 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.107235 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.119759 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.126382 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.126429 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.126445 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.126494 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.126511 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.133514 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.144902 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.157504 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.169355 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.180508 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.199126 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.215250 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.228212 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.228254 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.228265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.228280 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 
14:04:40.228292 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.257901 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.298537 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.330309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.330346 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.330358 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.330374 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.330386 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.340217 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.379855 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e5
2e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.432850 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.432898 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.432912 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc 
kubenswrapper[4801]: I0122 14:04:40.432929 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.432943 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.529727 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 10:50:44.22402526 +0000 UTC Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.535160 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.535194 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.535201 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.535214 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.535224 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.641194 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.641270 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.641283 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.641298 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.641308 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.744204 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.744241 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.744252 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.744268 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.744278 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.790113 4801 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.790810 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.814677 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.827965 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.840022 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.846865 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.846911 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.846920 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc 
kubenswrapper[4801]: I0122 14:04:40.846935 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.846945 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.860606 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.872238 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.885633 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.897389 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 
14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.908380 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.920312 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.932691 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.947766 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.949091 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.949120 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.949131 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.949147 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.949157 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:40Z","lastTransitionTime":"2026-01-22T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.960076 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.973117 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:40 crc kubenswrapper[4801]: I0122 14:04:40.985320 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.001215 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:40Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.022853 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.051679 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.051715 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.051726 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.051742 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.051754 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.094002 4801 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.153843 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.153884 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.153891 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.153905 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.153914 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.256348 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.256404 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.256417 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.256432 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.256443 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.359390 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.359437 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.359467 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.359487 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.359500 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.462595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.462666 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.462682 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.462707 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.462726 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.530924 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 00:43:27.597627714 +0000 UTC Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.565583 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.565652 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.565672 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.565701 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.565726 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.570294 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:41 crc kubenswrapper[4801]: E0122 14:04:41.570472 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.570524 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.570559 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:41 crc kubenswrapper[4801]: E0122 14:04:41.570747 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:41 crc kubenswrapper[4801]: E0122 14:04:41.571066 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.595522 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.616076 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.652412 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.669312 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.669365 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.669379 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.669399 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.669413 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.672607 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.689711 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.708373 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.777546 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.777583 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.777592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.777608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.777618 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.778082 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.792941 4801 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.794068 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.809975 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 
14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.829651 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.844182 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.856059 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e5
2e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.868410 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.878803 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.881183 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 
14:04:41.881246 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.881262 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.881285 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.881312 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.893502 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.985309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.985359 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.985372 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.985390 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:41 crc kubenswrapper[4801]: I0122 14:04:41.985403 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:41Z","lastTransitionTime":"2026-01-22T14:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.092024 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.092070 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.092083 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.092100 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.092110 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.195005 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.195066 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.195092 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.195114 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.195131 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.298409 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.298481 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.298493 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.298508 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.298519 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.401799 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.401844 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.401854 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.401871 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.401885 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.504546 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.504600 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.504610 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.504628 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.504643 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.532167 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 14:30:58.314346171 +0000 UTC Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.607489 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.607553 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.607567 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.607587 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.607604 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.710928 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.710977 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.710987 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.711002 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.711014 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.797544 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/0.log" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.799813 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466" exitCode=1 Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.799859 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.801272 4801 scope.go:117] "RemoveContainer" containerID="b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.815057 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.815229 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.815250 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.815269 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.815282 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.821355 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:42Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI0122 14:04:41.120144 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120416 6082 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI0122 14:04:41.120441 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120649 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120467 6082 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0122 14:04:41.120889 6082 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0122 14:04:41.120914 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0122 14:04:41.120923 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0122 14:04:41.120941 6082 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0122 14:04:41.120478 6082 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0122 14:04:41.120966 6082 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0122 14:04:41.120973 6082 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0122 14:04:41.120980 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0122 14:04:41.121164 6082 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.833373 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.847194 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.863473 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.877778 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.894178 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e5
2e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.909820 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.918613 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.918664 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.918677 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.918699 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.918714 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:42Z","lastTransitionTime":"2026-01-22T14:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.923264 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.936363 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.951407 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.964115 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.983286 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:42 crc kubenswrapper[4801]: I0122 14:04:42.997396 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:42Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.010474 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.022203 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc 
kubenswrapper[4801]: I0122 14:04:43.022259 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.022272 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.022288 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.022639 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.034851 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94231
57662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.125659 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.125711 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.125722 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.125775 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.125805 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.228155 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.228185 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.228194 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.228208 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.228218 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.331540 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.331577 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.331585 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.331601 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.331611 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.434796 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.434837 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.434850 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.434868 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.434879 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.532604 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 18:05:00.977440057 +0000 UTC Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.537944 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.537987 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.538000 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.538019 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.538033 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.570741 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.570756 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:43 crc kubenswrapper[4801]: E0122 14:04:43.570929 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.570765 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:43 crc kubenswrapper[4801]: E0122 14:04:43.571083 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:43 crc kubenswrapper[4801]: E0122 14:04:43.571180 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.641153 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.641202 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.641214 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.641232 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.641243 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.743643 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.743747 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.743764 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.743785 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.743800 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.804583 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/0.log" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.807107 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.807200 4801 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.820114 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.832098 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.846395 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.846434 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.846456 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.846470 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.846480 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.848785 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.859469 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.870431 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.884304 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.899394 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.911543 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.923752 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.931202 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln"] Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.931722 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.932869 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.933327 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.949598 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.949649 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.949662 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.949681 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.949694 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:43Z","lastTransitionTime":"2026-01-22T14:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.950116 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:42Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI0122 14:04:41.120144 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120416 6082 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0122 14:04:41.120441 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120649 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120467 6082 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0122 14:04:41.120889 6082 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0122 14:04:41.120914 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0122 14:04:41.120923 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0122 14:04:41.120941 6082 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0122 14:04:41.120478 6082 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0122 14:04:41.120966 6082 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0122 14:04:41.120973 6082 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0122 14:04:41.120980 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0122 14:04:41.121164 6082 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.962013 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.976149 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22
T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.986854 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:43 crc kubenswrapper[4801]: I0122 14:04:43.996322 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:43Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.008705 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.030237 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.043991 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.052619 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.052654 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.052666 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.052684 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.052697 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.057516 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z 
is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.073129 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.084755 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.100857 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5bab04-307a-449a-a36d-572d1bb5c66b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.100892 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c5bab04-307a-449a-a36d-572d1bb5c66b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.100912 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5bab04-307a-449a-a36d-572d1bb5c66b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.101053 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72x2\" (UniqueName: \"kubernetes.io/projected/6c5bab04-307a-449a-a36d-572d1bb5c66b-kube-api-access-s72x2\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: 
\"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.111385 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58
408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.127967 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.145282 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.155743 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.155788 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.155800 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.155822 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.155839 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.161761 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.185991 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:42Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI0122 14:04:41.120144 6082 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120416 6082 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0122 14:04:41.120441 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120649 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120467 6082 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0122 14:04:41.120889 6082 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0122 14:04:41.120914 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0122 14:04:41.120923 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0122 14:04:41.120941 6082 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0122 14:04:41.120478 6082 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0122 14:04:41.120966 6082 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0122 14:04:41.120973 6082 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0122 14:04:41.120980 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0122 14:04:41.121164 6082 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.198925 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.201799 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5bab04-307a-449a-a36d-572d1bb5c66b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.201845 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c5bab04-307a-449a-a36d-572d1bb5c66b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.201867 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5bab04-307a-449a-a36d-572d1bb5c66b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.201973 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s72x2\" (UniqueName: \"kubernetes.io/projected/6c5bab04-307a-449a-a36d-572d1bb5c66b-kube-api-access-s72x2\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.202927 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5bab04-307a-449a-a36d-572d1bb5c66b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.203059 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c5bab04-307a-449a-a36d-572d1bb5c66b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.210468 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5bab04-307a-449a-a36d-572d1bb5c66b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.215734 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 
14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.223223 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72x2\" (UniqueName: \"kubernetes.io/projected/6c5bab04-307a-449a-a36d-572d1bb5c66b-kube-api-access-s72x2\") pod \"ovnkube-control-plane-749d76644c-rbpln\" (UID: \"6c5bab04-307a-449a-a36d-572d1bb5c66b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.229392 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.242722 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.245002 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.259122 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.259328 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.259417 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.259553 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.259661 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.262764 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.285385 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.372832 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.373119 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.373209 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.373289 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.373367 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.475775 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.476018 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.476097 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.476159 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.476213 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.532907 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 23:45:00.002449639 +0000 UTC Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.578143 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.578171 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.578180 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.578192 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.578200 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.680089 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.680382 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.680500 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.680595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.680677 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.779617 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.779663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.779671 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.779688 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.779701 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: E0122 14:04:44.807645 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.812198 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" event={"ID":"6c5bab04-307a-449a-a36d-572d1bb5c66b","Type":"ContainerStarted","Data":"a0931df482a2fbb85f58bc4b20bbc1757e8b71193d7a95185a5688b84b708bad"} Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.812301 4801 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.819020 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.819066 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.819078 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.819099 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.819113 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: E0122 14:04:44.855321 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.863256 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.863286 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.863295 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.863309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.863318 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: E0122 14:04:44.882651 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.886011 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.886062 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.886079 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.886099 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.886113 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: E0122 14:04:44.898505 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.901705 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.901738 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.901747 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.901762 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.901773 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:44 crc kubenswrapper[4801]: E0122 14:04:44.913637 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:44Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:44 crc kubenswrapper[4801]: E0122 14:04:44.913794 4801 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.915229 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.915260 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.915274 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.915290 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:44 crc kubenswrapper[4801]: I0122 14:04:44.915302 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:44Z","lastTransitionTime":"2026-01-22T14:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.017979 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.018017 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.018028 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.018043 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.018055 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.120565 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.120792 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.120801 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.120816 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.120826 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.222417 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.222444 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.222466 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.222477 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.222486 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.324785 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.324820 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.324829 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.324844 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.324855 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.427647 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.427688 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.427697 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.427713 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.427726 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.530521 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.530561 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.530570 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.530588 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.530600 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.533657 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 12:07:58.222787386 +0000 UTC Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.571174 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.571211 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.571236 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:45 crc kubenswrapper[4801]: E0122 14:04:45.571316 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:45 crc kubenswrapper[4801]: E0122 14:04:45.571558 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:45 crc kubenswrapper[4801]: E0122 14:04:45.571699 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.633148 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.633567 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.633587 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.633612 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.633629 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.736735 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.736786 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.736805 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.736824 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.736837 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.782604 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ph2s5"] Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.783359 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:45 crc kubenswrapper[4801]: E0122 14:04:45.783480 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.800319 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889
fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.819709 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.819752 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/1.log" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.820657 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/0.log" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.824870 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad" exitCode=1 Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.825119 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" 
event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.825613 4801 scope.go:117] "RemoveContainer" containerID="b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.826249 4801 scope.go:117] "RemoveContainer" containerID="d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad" Jan 22 14:04:45 crc kubenswrapper[4801]: E0122 14:04:45.826471 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.832973 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" event={"ID":"6c5bab04-307a-449a-a36d-572d1bb5c66b","Type":"ContainerStarted","Data":"e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.833028 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" event={"ID":"6c5bab04-307a-449a-a36d-572d1bb5c66b","Type":"ContainerStarted","Data":"55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.839224 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.839254 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.839443 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.839491 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.839516 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.839535 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.855246 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.875227 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.887918 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.911160 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.919541 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.919602 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8hr2\" (UniqueName: \"kubernetes.io/projected/18bcc554-da90-40e9-b32f-e0d5d0936faa-kube-api-access-h8hr2\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.926708 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.941014 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.941931 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:45 crc 
kubenswrapper[4801]: I0122 14:04:45.941976 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.941987 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.942005 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.942017 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:45Z","lastTransitionTime":"2026-01-22T14:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.956659 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94231
57662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.971791 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:45 crc kubenswrapper[4801]: I0122 14:04:45.988556 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:45Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.001783 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.013831 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.020984 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.021040 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h8hr2\" (UniqueName: \"kubernetes.io/projected/18bcc554-da90-40e9-b32f-e0d5d0936faa-kube-api-access-h8hr2\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:46 crc kubenswrapper[4801]: E0122 14:04:46.021146 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:46 crc kubenswrapper[4801]: E0122 14:04:46.021212 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. No retries permitted until 2026-01-22 14:04:46.521195716 +0000 UTC m=+35.223095909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.035380 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:42Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI0122 14:04:41.120144 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120416 6082 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0122 14:04:41.120441 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120649 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120467 6082 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0122 14:04:41.120889 6082 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0122 14:04:41.120914 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0122 14:04:41.120923 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0122 14:04:41.120941 6082 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0122 14:04:41.120478 6082 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0122 14:04:41.120966 6082 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0122 14:04:41.120973 6082 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0122 14:04:41.120980 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0122 14:04:41.121164 6082 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.036713 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8hr2\" (UniqueName: \"kubernetes.io/projected/18bcc554-da90-40e9-b32f-e0d5d0936faa-kube-api-access-h8hr2\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.044133 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.044183 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.044195 4801 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.044215 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.044227 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.046595 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.057905 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.072286 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.083433 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.097358 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.113656 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.127597 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.139597 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.146338 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.146399 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.146417 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.146438 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.146490 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.153406 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z"
Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.170588 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.180153 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.198782 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/ma
nifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed87
68113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.210741 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.222401 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.232599 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.248178 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.248221 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.248234 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.248251 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.248270 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.252370 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:42Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI0122 14:04:41.120144 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120416 6082 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0122 14:04:41.120441 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120649 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120467 6082 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0122 14:04:41.120889 6082 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0122 14:04:41.120914 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0122 14:04:41.120923 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0122 14:04:41.120941 6082 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0122 14:04:41.120478 6082 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0122 14:04:41.120966 6082 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0122 14:04:41.120973 6082 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0122 14:04:41.120980 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0122 14:04:41.121164 6082 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"s.LB{}\\\\nI0122 14:04:44.746781 6218 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI0122 14:04:44.746779 6218 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0122 14:04:44.746435 6218 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0122 14:04:44.746485 6218 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"hos
t-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.262600 4801 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.272126 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc 
kubenswrapper[4801]: I0122 14:04:46.284162 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.350081 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.350137 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.350156 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.350178 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.350194 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.452702 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.452744 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.452753 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.452768 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.452778 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.527734 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:46 crc kubenswrapper[4801]: E0122 14:04:46.527908 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:46 crc kubenswrapper[4801]: E0122 14:04:46.527986 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. No retries permitted until 2026-01-22 14:04:47.527964505 +0000 UTC m=+36.229864698 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.534202 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 11:31:27.318743729 +0000 UTC Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.555631 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.555705 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.555723 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.555748 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.555779 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.659488 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.659552 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.659568 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.659592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.659609 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.707711 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.723854 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.737635 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a
7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPat
h\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.747951 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.761960 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.762000 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.762015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.762033 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.762046 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.770826 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.789392 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.800618 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.813633 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.830114 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:42Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI0122 14:04:41.120144 6082 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120416 6082 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0122 14:04:41.120441 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120649 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120467 6082 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0122 14:04:41.120889 6082 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0122 14:04:41.120914 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0122 14:04:41.120923 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0122 14:04:41.120941 6082 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0122 14:04:41.120478 6082 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0122 14:04:41.120966 6082 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0122 14:04:41.120973 6082 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0122 14:04:41.120980 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0122 14:04:41.121164 6082 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"s.LB{}\\\\nI0122 14:04:44.746781 6218 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI0122 14:04:44.746779 6218 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] 
Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0122 14:04:44.746435 6218 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0122 14:04:44.746485 6218 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17
dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.838524 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/1.log" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.840926 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.850902 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc 
kubenswrapper[4801]: I0122 14:04:46.863518 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.863971 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.864018 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.864031 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.864046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.864057 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.874321 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.886724 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.898791 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.916204 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.927296 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.940004 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:46Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.966667 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.966712 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.966722 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.966736 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:46 crc kubenswrapper[4801]: I0122 14:04:46.966746 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:46Z","lastTransitionTime":"2026-01-22T14:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.070347 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.070426 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.070483 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.070513 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.070535 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.173484 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.173528 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.173539 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.173553 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.173564 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.233258 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.233333 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.233436 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.233484 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.233485 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.233496 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:47 crc 
kubenswrapper[4801]: E0122 14:04:47.233503 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.233515 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.233554 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 14:05:03.233537675 +0000 UTC m=+51.935437858 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.233570 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:05:03.233563036 +0000 UTC m=+51.935463219 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.277098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.277146 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.277159 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.277176 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.277185 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.379639 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.379699 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.379716 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.379735 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.379746 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.435391 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.435533 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.435584 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:05:03.435558909 +0000 UTC m=+52.137459102 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.435628 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.435648 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.435693 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:05:03.435682153 +0000 UTC m=+52.137582346 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.435823 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.435931 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:05:03.435904499 +0000 UTC m=+52.137804762 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.482695 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.482733 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.482742 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.482758 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc 
kubenswrapper[4801]: I0122 14:04:47.482768 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.535251 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 16:20:55.987091642 +0000 UTC Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.537306 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.537660 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.537781 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. No retries permitted until 2026-01-22 14:04:49.537758645 +0000 UTC m=+38.239658868 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.571569 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.571655 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.571762 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.571794 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.571821 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.572003 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.572153 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:47 crc kubenswrapper[4801]: E0122 14:04:47.572329 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.591295 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.591372 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.591524 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.591568 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.591584 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.694106 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.694349 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.694366 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.694383 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.694396 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.796994 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.797029 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.797038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.797051 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.797061 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.899526 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.899580 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.899592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.899609 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:47 crc kubenswrapper[4801]: I0122 14:04:47.899625 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:47Z","lastTransitionTime":"2026-01-22T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.002998 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.003054 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.003069 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.003086 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.003096 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.106406 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.106546 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.106573 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.106604 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.106627 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.209162 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.209212 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.209230 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.209246 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.209257 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.312917 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.312998 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.313034 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.313068 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.313091 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.415663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.415710 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.415722 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.415740 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.415752 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.518009 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.518061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.518083 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.518104 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.518118 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.535395 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 19:27:45.741963913 +0000 UTC Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.621777 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.621836 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.621854 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.621878 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.621898 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.724795 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.724861 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.724890 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.724935 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.724960 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.828790 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.828832 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.828841 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.828859 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.828869 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.931845 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.931886 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.931900 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.931917 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:48 crc kubenswrapper[4801]: I0122 14:04:48.931928 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:48Z","lastTransitionTime":"2026-01-22T14:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.034361 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.034435 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.034510 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.034551 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.034576 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.137896 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.137949 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.137964 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.137984 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.138000 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.240150 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.240190 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.240198 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.240213 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.240223 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.342933 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.342986 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.342997 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.343012 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.343021 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.446596 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.446669 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.446683 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.446699 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.446711 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.536084 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 17:43:39.758463227 +0000 UTC Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.549316 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.549352 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.549362 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.549377 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.549389 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.559912 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:49 crc kubenswrapper[4801]: E0122 14:04:49.560012 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:49 crc kubenswrapper[4801]: E0122 14:04:49.560071 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. No retries permitted until 2026-01-22 14:04:53.560055662 +0000 UTC m=+42.261955845 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.570519 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.570619 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.570646 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.570697 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:49 crc kubenswrapper[4801]: E0122 14:04:49.570705 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:04:49 crc kubenswrapper[4801]: E0122 14:04:49.570785 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:49 crc kubenswrapper[4801]: E0122 14:04:49.570942 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:49 crc kubenswrapper[4801]: E0122 14:04:49.571110 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.652118 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.652175 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.652190 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.652211 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.652227 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.755499 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.755533 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.755546 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.755563 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.755574 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.858005 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.858086 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.858106 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.858134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.858153 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.961344 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.961381 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.961389 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.961404 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:49 crc kubenswrapper[4801]: I0122 14:04:49.961412 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:49Z","lastTransitionTime":"2026-01-22T14:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.065339 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.065403 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.065426 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.065480 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.065505 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.168768 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.168834 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.168857 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.168887 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.168911 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.271852 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.271898 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.271906 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.271920 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.271929 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.375087 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.375149 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.375160 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.375177 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.375703 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.479412 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.479468 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.479481 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.479498 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.479509 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.536282 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 11:52:49.77395776 +0000 UTC Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.581609 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.581669 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.581684 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.581711 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.581727 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.684217 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.684253 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.684265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.684281 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.684296 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.787246 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.787298 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.787313 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.787330 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.787343 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.889831 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.889876 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.889887 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.889903 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.889915 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.993770 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.993833 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.993844 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.993869 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:50 crc kubenswrapper[4801]: I0122 14:04:50.993881 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:50Z","lastTransitionTime":"2026-01-22T14:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.097207 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.097371 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.097390 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.097418 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.097435 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.200120 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.200149 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.200158 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.200172 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.200181 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.303641 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.303694 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.303715 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.303739 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.303756 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.406423 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.406543 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.406566 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.406595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.406613 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.509234 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.509595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.509811 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.509965 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.510172 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.536833 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 18:58:34.036688078 +0000 UTC Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.570691 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.570718 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.570718 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.570776 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:51 crc kubenswrapper[4801]: E0122 14:04:51.570863 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:51 crc kubenswrapper[4801]: E0122 14:04:51.571035 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:04:51 crc kubenswrapper[4801]: E0122 14:04:51.571200 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:51 crc kubenswrapper[4801]: E0122 14:04:51.571272 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.608505 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.613406 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.613463 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.613476 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.613492 4801 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.613503 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.626375 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.637853 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.655726 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.664758 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.678930 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-api
server\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f
36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 
14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3
5825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.690808 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.702186 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.715715 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.715756 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.715799 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.715816 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.715827 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.719437 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ac16631a8958b7d13e50f15904cb9840c666238ed6265ca31a0bf63c75e466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:42Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI0122 14:04:41.120144 6082 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120416 6082 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0122 14:04:41.120441 6082 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120649 6082 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0122 14:04:41.120467 6082 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0122 14:04:41.120889 6082 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0122 14:04:41.120914 6082 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0122 14:04:41.120923 6082 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0122 14:04:41.120941 6082 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0122 14:04:41.120478 6082 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0122 14:04:41.120966 6082 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0122 14:04:41.120973 6082 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0122 14:04:41.120980 6082 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0122 14:04:41.121164 6082 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"s.LB{}\\\\nI0122 14:04:44.746781 6218 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI0122 14:04:44.746779 6218 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0122 14:04:44.746435 6218 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0122 14:04:44.746485 6218 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"hos
t-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.734044 4801 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.748242 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc 
kubenswrapper[4801]: I0122 14:04:51.761850 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.774939 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.787440 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.800670 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.816320 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.817760 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.817898 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.818046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.818123 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.818215 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.830151 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:51Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.920975 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.921032 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.921049 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.921072 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:51 crc kubenswrapper[4801]: I0122 14:04:51.921089 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:51Z","lastTransitionTime":"2026-01-22T14:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.024489 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.024548 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.024559 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.024576 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.024587 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.127335 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.127381 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.127397 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.127418 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.127435 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.230374 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.230413 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.230424 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.230438 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.230477 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.333018 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.333076 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.333094 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.333120 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.333138 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.436605 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.436675 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.436689 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.436713 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.436730 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.537520 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 02:28:49.831926739 +0000 UTC Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.539734 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.539786 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.539804 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.539830 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.539847 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.642600 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.642651 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.642664 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.642682 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.642694 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.744989 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.745044 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.745062 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.745079 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.745088 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.848420 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.848519 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.848542 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.848569 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.848591 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.951310 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.951391 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.951410 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.951435 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:52 crc kubenswrapper[4801]: I0122 14:04:52.951452 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:52Z","lastTransitionTime":"2026-01-22T14:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.054784 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.054857 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.054880 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.054904 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.054923 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.157955 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.158014 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.158038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.158068 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.158090 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.261295 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.261361 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.261382 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.261425 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.261508 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.365075 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.365130 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.365146 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.365170 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.365218 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.467932 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.467993 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.468006 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.468025 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.468037 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.537986 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 06:01:30.413665971 +0000 UTC Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.570563 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.570683 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:53 crc kubenswrapper[4801]: E0122 14:04:53.570776 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:53 crc kubenswrapper[4801]: E0122 14:04:53.570873 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.570976 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:53 crc kubenswrapper[4801]: E0122 14:04:53.571848 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.572614 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.572760 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.572826 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.572841 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.572865 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: E0122 14:04:53.572853 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.572896 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.599579 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:53 crc kubenswrapper[4801]: E0122 14:04:53.599756 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:53 crc kubenswrapper[4801]: E0122 14:04:53.599831 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. No retries permitted until 2026-01-22 14:05:01.599811618 +0000 UTC m=+50.301711801 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.676973 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.677074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.677099 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.677133 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.677155 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.779802 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.779874 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.779892 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.779916 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.779934 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.883264 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.883323 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.883342 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.883367 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.883384 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.986479 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.986568 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.986589 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.986615 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:53 crc kubenswrapper[4801]: I0122 14:04:53.986632 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:53Z","lastTransitionTime":"2026-01-22T14:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.089803 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.089867 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.089884 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.089911 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.089930 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.193103 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.193168 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.193190 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.193219 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.193241 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.296439 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.296514 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.296525 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.296544 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.296555 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.398872 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.398912 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.398924 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.398941 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.398952 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.501537 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.501591 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.501603 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.501621 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.501634 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.538521 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 03:10:12.808505955 +0000 UTC Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.605274 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.605333 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.605346 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.605370 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.605385 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.708559 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.708600 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.708610 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.708625 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.708636 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.811873 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.811934 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.811943 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.811968 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.811983 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.916281 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.916336 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.916354 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.916378 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:54 crc kubenswrapper[4801]: I0122 14:04:54.916397 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:54Z","lastTransitionTime":"2026-01-22T14:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.020317 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.020497 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.020529 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.020561 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.020580 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.123157 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.123221 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.123243 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.123272 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.123294 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.226377 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.226436 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.226503 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.226537 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.226559 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.313640 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.313709 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.313733 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.313761 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.313783 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.335973 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.341799 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.341846 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.341862 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.341888 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.341905 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.363784 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.369348 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.369390 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.369406 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.369428 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.369446 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.392073 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.397571 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.397615 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.397632 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.397656 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.397673 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.413328 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.417844 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.417882 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.417893 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.417910 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.417924 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.431212 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.431440 4801 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.432957 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.432988 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.433000 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.433014 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.433026 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.442686 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.443505 4801 scope.go:117] "RemoveContainer" containerID="d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.461074 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.478565 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.490515 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.503511 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.515757 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.531018 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.536038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.536079 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.536096 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.536145 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.536164 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.539216 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 16:27:04.999212417 +0000 UTC Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.552435 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\
\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121
e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.568372 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.571369 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.571381 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.571398 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.571561 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.572028 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.572150 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.572274 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:04:55 crc kubenswrapper[4801]: E0122 14:04:55.572419 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.588637 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.614673 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.628212 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.639246 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.639289 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.639305 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.639327 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.639344 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.642304 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.655373 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.667356 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.691953 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"s.LB{}\\\\nI0122 14:04:44.746781 6218 services_controller.go:453] Built service 
openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI0122 14:04:44.746779 6218 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0122 14:04:44.746435 6218 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0122 14:04:44.746485 6218 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.703810 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.714946 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc 
kubenswrapper[4801]: I0122 14:04:55.742208 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.742267 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.742283 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.742302 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.742315 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.844312 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.844352 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.844364 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.844382 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.844393 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.873106 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/1.log" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.875774 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.876203 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.887091 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.898737 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.908335 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.919851 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.930216 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.940618 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.946241 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.946288 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.946298 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.946311 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.946320 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:55Z","lastTransitionTime":"2026-01-22T14:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.954347 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.968996 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.983293 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:55 crc kubenswrapper[4801]: I0122 14:04:55.993235 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-22T14:04:55Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.016717 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.047059 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.048863 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.048898 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.048908 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.048924 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.048934 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.060935 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.077987 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"s.LB{}\\\\nI0122 14:04:44.746781 6218 services_controller.go:453] Built service 
openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI0122 14:04:44.746779 6218 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0122 14:04:44.746435 6218 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0122 14:04:44.746485 6218 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.088113 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.096801 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc 
kubenswrapper[4801]: I0122 14:04:56.108234 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.150858 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.150892 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.150900 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.150914 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.150922 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.253202 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.253237 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.253246 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.253259 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.253269 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.356592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.356659 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.356684 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.356714 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.356736 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.459418 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.459537 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.459553 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.459580 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.459601 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.540096 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 00:10:43.624868446 +0000 UTC Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.566187 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.566248 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.566262 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.566297 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.566311 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.669095 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.669432 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.669480 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.669499 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.669511 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.772671 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.772712 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.772721 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.772737 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.772745 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.875861 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.875944 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.875965 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.875994 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.876014 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.882052 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/2.log" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.883237 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/1.log" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.889697 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60" exitCode=1 Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.889750 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.889787 4801 scope.go:117] "RemoveContainer" containerID="d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.891595 4801 scope.go:117] "RemoveContainer" containerID="1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60" Jan 22 14:04:56 crc kubenswrapper[4801]: E0122 14:04:56.894254 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.910986 4801 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc 
kubenswrapper[4801]: I0122 14:04:56.931606 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.946282 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.962527 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.978751 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.978795 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.978811 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.978868 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.978885 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:56Z","lastTransitionTime":"2026-01-22T14:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.981002 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d571d6bb2b83f46c7de85ee0d1141e17fbcca4548aea6862448d4fdf206fccad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"s.LB{}\\\\nI0122 14:04:44.746781 6218 services_controller.go:453] Built service openshift-kube-scheduler/scheduler template LB for network=default: []services.LB{}\\\\nI0122 14:04:44.746779 6218 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0122 14:04:44.746435 6218 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0122 14:04:44.746485 6218 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization,\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:56 crc kubenswrapper[4801]: I0122 14:04:56.992647 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.006951 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22
T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.020945 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.035738 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.051837 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.071739 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.082195 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.082254 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.082272 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.082297 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.082315 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.085616 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.107415 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107
909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.122706 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.137525 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.156008 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.170301 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.185348 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.185406 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.185418 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.185436 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.185483 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.289046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.289100 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.289114 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.289133 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.289146 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.391672 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.391717 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.391728 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.391744 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.391758 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.494375 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.494431 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.494461 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.494485 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.494502 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.540529 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 03:52:40.079919527 +0000 UTC Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.571518 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.571675 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:04:57 crc kubenswrapper[4801]: E0122 14:04:57.571890 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.571991 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.572013 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:04:57 crc kubenswrapper[4801]: E0122 14:04:57.572191 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:04:57 crc kubenswrapper[4801]: E0122 14:04:57.572509 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:04:57 crc kubenswrapper[4801]: E0122 14:04:57.572603 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.597907 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.597977 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.597997 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.598030 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.598050 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.702424 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.702906 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.703001 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.703723 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.703852 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.807539 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.807592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.807608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.807632 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.807648 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.894302 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/2.log" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.897979 4801 scope.go:117] "RemoveContainer" containerID="1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60" Jan 22 14:04:57 crc kubenswrapper[4801]: E0122 14:04:57.898305 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.911268 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.911355 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.911379 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.911433 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.911491 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:57Z","lastTransitionTime":"2026-01-22T14:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.917087 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\
\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.933255 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.948041 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.977895 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:57 crc kubenswrapper[4801]: I0122 14:04:57.995330 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:57Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.008599 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.013894 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.013926 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.013937 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.013955 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.013968 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.021936 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.041468 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.055743 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.069971 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc 
kubenswrapper[4801]: I0122 14:04:58.082434 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.094033 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.104512 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.113682 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.116238 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.116287 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.116299 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.116317 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.116367 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.124902 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.134901 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.144089 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:04:58Z is after 2025-08-24T17:21:41Z" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.218722 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.218762 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.218772 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.218787 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.218797 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.321129 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.321181 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.321200 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.321219 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.321232 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.424046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.424103 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.424119 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.424140 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.424154 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.526859 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.526901 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.526910 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.526925 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.526934 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.541331 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 13:37:53.555077659 +0000 UTC Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.630035 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.630074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.630082 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.630096 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.630107 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.732365 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.732433 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.732492 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.732525 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.732552 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.834635 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.834667 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.834676 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.834690 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.834701 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.937289 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.937358 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.937376 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.937421 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:58 crc kubenswrapper[4801]: I0122 14:04:58.937436 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:58Z","lastTransitionTime":"2026-01-22T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.039983 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.040042 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.040050 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.040062 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.040070 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:59Z","lastTransitionTime":"2026-01-22T14:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.142303 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.142420 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.142435 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.142484 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.142499 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:59Z","lastTransitionTime":"2026-01-22T14:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.246058 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.246126 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.246143 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.246164 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.246175 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:59Z","lastTransitionTime":"2026-01-22T14:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.348972 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.349013 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.349024 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.349041 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:04:59 crc kubenswrapper[4801]: I0122 14:04:59.349052 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:04:59Z","lastTransitionTime":"2026-01-22T14:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.118607 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 08:59:07.317297453 +0000 UTC Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.118759 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:00 crc kubenswrapper[4801]: E0122 14:05:00.118852 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.120162 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.120287 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.120221 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:00 crc kubenswrapper[4801]: E0122 14:05:00.120526 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:00 crc kubenswrapper[4801]: E0122 14:05:00.120527 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:00 crc kubenswrapper[4801]: E0122 14:05:00.120701 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.122319 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.122380 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.122392 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.122411 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.122424 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.224175 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.224668 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.224737 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.224806 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.224873 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.327261 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.327336 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.327357 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.327375 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.327386 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.430082 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.430505 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.430641 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.430787 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.430912 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.534098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.534147 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.534161 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.534178 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.534190 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.636254 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.636291 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.636302 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.636318 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.636330 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.739064 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.739116 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.739131 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.739153 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.739168 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.842714 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.842771 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.842795 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.842823 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.842846 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.945979 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.946079 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.946098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.946537 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:00 crc kubenswrapper[4801]: I0122 14:05:00.946563 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:00Z","lastTransitionTime":"2026-01-22T14:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.049335 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.049380 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.049391 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.049408 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.049464 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.119839 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 13:20:06.285648274 +0000 UTC Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.151979 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.152015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.152025 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.152043 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.152054 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.255401 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.255479 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.255499 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.255535 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.255570 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.358150 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.358210 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.358225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.358251 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.358266 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.461600 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.461665 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.461680 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.461703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.461718 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.564552 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.564595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.564608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.564624 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.564635 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.570684 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.570748 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:01 crc kubenswrapper[4801]: E0122 14:05:01.570795 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:01 crc kubenswrapper[4801]: E0122 14:05:01.570895 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.570981 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:01 crc kubenswrapper[4801]: E0122 14:05:01.571189 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.571211 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:01 crc kubenswrapper[4801]: E0122 14:05:01.571341 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.591956 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.623834 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.637673 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:01 crc kubenswrapper[4801]: E0122 14:05:01.638054 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:05:01 crc kubenswrapper[4801]: E0122 14:05:01.638407 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. 
No retries permitted until 2026-01-22 14:05:17.638208373 +0000 UTC m=+66.340108596 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.638514 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.652982 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc 
kubenswrapper[4801]: I0122 14:05:01.667849 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.667891 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.667903 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.667921 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.667935 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.670800 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.690393 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.705943 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.721759 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e
4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.741159 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.756785 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.771928 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.771975 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.771993 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.772016 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.772033 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.772938 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.791241 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.809310 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.822908 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.846505 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.859583 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.873740 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:01Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.874480 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:01 crc 
kubenswrapper[4801]: I0122 14:05:01.874510 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.874522 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.874537 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.874547 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.977067 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.977097 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.977106 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.977119 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:01 crc kubenswrapper[4801]: I0122 14:05:01.977128 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:01Z","lastTransitionTime":"2026-01-22T14:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.079786 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.080323 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.080411 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.080530 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.080621 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.120257 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 01:44:17.123039294 +0000 UTC
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.183528 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.183581 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.183702 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.183728 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.183745 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.285394 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.285469 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.285489 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.285511 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.285525 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.388504 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.388557 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.388570 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.388595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.388609 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.491609 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.491652 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.491662 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.491678 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.491690 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.594183 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.594216 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.594225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.594238 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.594246 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.697398 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.697515 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.697539 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.697564 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.697582 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.800701 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.800748 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.800759 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.800777 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.800789 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.903490 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.903548 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.903565 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.903591 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:02 crc kubenswrapper[4801]: I0122 14:05:02.903608 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:02Z","lastTransitionTime":"2026-01-22T14:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.006222 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.006269 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.006286 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.006310 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.006327 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.109056 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.109118 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.109158 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.109219 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.109245 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.120613 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 18:42:05.007458075 +0000 UTC
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.212931 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.213015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.213050 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.213080 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.213105 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.255279 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.255371 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255597 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255609 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255628 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255644 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255651 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255663 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255732 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 14:05:35.255710291 +0000 UTC m=+83.957610514 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.255762 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:05:35.255750162 +0000 UTC m=+83.957650385 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.316571 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.316634 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.316655 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.316680 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.316697 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.418754 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.418809 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.418818 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.418832 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.418841 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.456882 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.456979 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.457069 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:05:35.457038294 +0000 UTC m=+84.158938487 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.457123 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.457203 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.457210 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:05:35.457193289 +0000 UTC m=+84.159093472 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.457238 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-22 14:05:35.45723072 +0000 UTC m=+84.159130903 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.457147 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.466765 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.475546 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.479304 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.489165 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc 
kubenswrapper[4801]: I0122 14:05:03.502487 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.515152 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.520778 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.520810 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.520820 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.520836 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.520847 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.527400 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.548354 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.563194 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.570606 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.570643 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.570679 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.570761 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.570825 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.570924 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.571086 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:03 crc kubenswrapper[4801]: E0122 14:05:03.571348 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.577904 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.588934 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.600240 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.611392 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.621697 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.622693 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.622730 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.622741 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.622756 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.622765 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.640430 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.651724 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.663143 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.676857 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.689343 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-22T14:05:03Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.725379 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.725665 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.725759 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.725849 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.725919 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.829476 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.829863 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.830012 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.830164 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.830352 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.933324 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.933381 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.933398 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.933422 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:03 crc kubenswrapper[4801]: I0122 14:05:03.933442 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:03Z","lastTransitionTime":"2026-01-22T14:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.036098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.036415 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.036628 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.036850 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.037038 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.121613 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 20:41:04.400559351 +0000 UTC Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.139112 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.139173 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.139190 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.139221 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.139239 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.242004 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.242044 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.242057 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.242074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.242084 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.344766 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.344849 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.344896 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.344919 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.344936 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.447641 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.447733 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.447848 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.447883 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.447906 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.551324 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.551390 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.551403 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.551429 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.551444 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.654909 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.654961 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.654974 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.654997 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.655013 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.758017 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.758081 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.758103 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.758128 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.758144 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.861606 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.861685 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.861703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.861729 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.861745 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.964530 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.964576 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.964587 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.964602 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:04 crc kubenswrapper[4801]: I0122 14:05:04.964613 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:04Z","lastTransitionTime":"2026-01-22T14:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.066737 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.066803 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.066824 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.066849 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.066867 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.122073 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 14:29:54.231663606 +0000 UTC Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.170045 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.170109 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.170126 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.170152 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.170170 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.273032 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.273074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.273085 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.273100 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.273109 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.376658 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.376757 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.376772 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.376790 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.376803 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.480791 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.480843 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.480853 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.480877 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.480889 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.556802 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.556850 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.556860 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.556877 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.556889 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.570955 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.571141 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.571682 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.571756 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.571914 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.571920 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.571779 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.571977 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.573603 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:05Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.580365 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.580562 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.580816 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.580938 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.581061 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.642981 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:05Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.648417 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.648483 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.648495 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.648513 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.648525 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.661157 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:05Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:05 crc kubenswrapper[4801]: E0122 14:05:05.661306 4801 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.663643 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.663682 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.663694 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.663712 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.663725 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.767029 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.767339 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.767405 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.767525 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.767605 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.870874 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.870973 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.870992 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.871017 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.871036 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.974161 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.974245 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.974269 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.974298 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:05 crc kubenswrapper[4801]: I0122 14:05:05.974325 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:05Z","lastTransitionTime":"2026-01-22T14:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.077033 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.077386 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.077564 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.077762 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.077904 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.122714 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 00:10:54.976059362 +0000 UTC Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.181219 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.181725 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.181950 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.182183 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.182424 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.285286 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.285361 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.285373 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.285397 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.285410 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.388425 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.388493 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.388509 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.388534 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.388551 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.491701 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.491750 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.491766 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.491789 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.491805 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.594386 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.594534 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.594563 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.594640 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.594674 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.697412 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.697478 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.697487 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.697500 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.697556 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.800401 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.800472 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.800484 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.800501 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.800512 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.902443 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.902781 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.902870 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.902956 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:06 crc kubenswrapper[4801]: I0122 14:05:06.903048 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:06Z","lastTransitionTime":"2026-01-22T14:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.005058 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.005322 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.005397 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.005510 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.005594 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.108500 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.108658 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.108724 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.108799 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.108871 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.122972 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 19:26:39.822784937 +0000 UTC Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.211408 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.211494 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.211507 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.211528 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.211540 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.314156 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.314198 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.314210 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.314230 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.314245 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.416882 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.417032 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.417056 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.417084 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.417102 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.520857 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.520922 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.520956 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.520985 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.521006 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.571762 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.571885 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.571809 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:07 crc kubenswrapper[4801]: E0122 14:05:07.571992 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.572247 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:07 crc kubenswrapper[4801]: E0122 14:05:07.572233 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:07 crc kubenswrapper[4801]: E0122 14:05:07.572417 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:07 crc kubenswrapper[4801]: E0122 14:05:07.572586 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.624121 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.624201 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.624225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.624256 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.624280 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.727623 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.727727 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.727753 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.727783 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.727805 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.830870 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.830932 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.830945 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.830964 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.830977 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.934250 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.935181 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.935229 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.935264 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:07 crc kubenswrapper[4801]: I0122 14:05:07.935303 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:07Z","lastTransitionTime":"2026-01-22T14:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.038269 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.038335 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.038353 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.038376 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.038393 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.123417 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 13:34:50.902632214 +0000 UTC Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.141009 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.141050 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.141063 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.141078 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.141088 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.244414 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.244559 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.244589 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.244615 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.244633 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.346867 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.346900 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.346908 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.346921 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.346931 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.449721 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.449783 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.449797 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.449895 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.449907 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.552571 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.552626 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.552640 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.552658 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.552671 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.571664 4801 scope.go:117] "RemoveContainer" containerID="1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60" Jan 22 14:05:08 crc kubenswrapper[4801]: E0122 14:05:08.571863 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.655510 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.655578 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.655601 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.655636 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.655659 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.758499 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.758580 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.758594 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.758613 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.758623 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.863070 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.863160 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.863257 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.863344 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.863365 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.966681 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.966749 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.966769 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.966797 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:08 crc kubenswrapper[4801]: I0122 14:05:08.966818 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:08Z","lastTransitionTime":"2026-01-22T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.069186 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.069233 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.069241 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.069254 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.069278 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.124093 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 09:02:47.529197606 +0000 UTC Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.171793 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.171857 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.171877 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.171904 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.171923 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.275713 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.275794 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.275818 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.275857 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.275880 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.378536 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.378597 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.378611 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.378633 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.378648 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.481683 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.481768 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.481792 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.481821 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.481844 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.571119 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.571234 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.571287 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.571164 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:09 crc kubenswrapper[4801]: E0122 14:05:09.571365 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:09 crc kubenswrapper[4801]: E0122 14:05:09.571545 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:09 crc kubenswrapper[4801]: E0122 14:05:09.571668 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:09 crc kubenswrapper[4801]: E0122 14:05:09.571779 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.584015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.584075 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.584088 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.584112 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.584127 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.687815 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.687875 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.687891 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.687909 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.687921 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.790669 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.790740 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.790764 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.790798 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.790821 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.893931 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.893971 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.894011 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.894029 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.894040 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.996024 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.996069 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.996079 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.996096 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:09 crc kubenswrapper[4801]: I0122 14:05:09.996111 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:09Z","lastTransitionTime":"2026-01-22T14:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.098773 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.098826 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.098843 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.098868 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.098891 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.124372 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 20:17:28.210650737 +0000 UTC Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.201625 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.201677 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.201689 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.201709 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.201729 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.305631 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.305692 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.305702 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.305722 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.305733 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.408822 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.408939 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.408957 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.408981 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.409000 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.511636 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.511720 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.511766 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.511802 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.511834 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.615182 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.615226 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.615235 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.615249 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.615261 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.718292 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.718358 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.718381 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.718407 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.718425 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.822034 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.822106 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.822132 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.822164 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.822188 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.924909 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.924939 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.924948 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.924962 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:10 crc kubenswrapper[4801]: I0122 14:05:10.924971 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:10Z","lastTransitionTime":"2026-01-22T14:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.027639 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.027688 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.027697 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.027711 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.027721 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.125503 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:56:28.083275422 +0000 UTC
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.130105 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.130130 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.130141 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.130154 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.130163 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.232173 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.232205 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.232215 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.232228 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.232238 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.333994 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.334054 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.334070 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.334094 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.334113 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.436560 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.436597 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.436605 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.436639 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.436650 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.538972 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.539048 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.539070 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.539101 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.539123 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.570636 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.570643 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.570694 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:11 crc kubenswrapper[4801]: E0122 14:05:11.570836 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.570942 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:11 crc kubenswrapper[4801]: E0122 14:05:11.571048 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:11 crc kubenswrapper[4801]: E0122 14:05:11.571108 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:11 crc kubenswrapper[4801]: E0122 14:05:11.571292 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.583922 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.602644 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.622327 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.642159 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.642208 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.642224 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.642247 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.642266 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.646180 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.659404 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.675208 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b
1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:
04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.687653 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.698615 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc 
kubenswrapper[4801]: I0122 14:05:11.709799 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.720510 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.733021 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.744440 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.744869 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.744893 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.744901 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.744913 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.744922 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.762762 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.775593 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.785429 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22
T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.795354 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.803663 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.812868 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:11Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.847241 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.847548 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.847738 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.847949 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.848136 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.950992 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.951029 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.951040 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.951056 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:11 crc kubenswrapper[4801]: I0122 14:05:11.951066 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:11Z","lastTransitionTime":"2026-01-22T14:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.053528 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.053844 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.053933 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.054061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.054168 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.125866 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 09:57:31.331863973 +0000 UTC Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.157387 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.157433 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.157444 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.157492 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.157505 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.260479 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.260521 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.260531 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.260546 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.260558 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.363887 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.364494 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.364524 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.364598 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.364627 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.467966 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.468065 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.468088 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.468112 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.468132 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.571010 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.571065 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.571081 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.571101 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.571116 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.674251 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.674303 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.674317 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.674338 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.674353 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.779402 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.779473 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.779486 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.779503 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.779516 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.882559 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.882597 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.882608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.882624 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.882636 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.985643 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.985710 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.985728 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.985752 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:12 crc kubenswrapper[4801]: I0122 14:05:12.985769 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:12Z","lastTransitionTime":"2026-01-22T14:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.089330 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.089382 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.089393 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.089410 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.089421 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.126853 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 12:00:40.792701178 +0000 UTC Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.192146 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.192193 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.192207 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.192230 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.192244 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.295019 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.295084 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.295102 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.295138 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.295157 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.397321 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.397380 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.397393 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.397411 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.397423 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.500417 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.500488 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.500501 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.500517 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.500533 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.570475 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.570603 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:13 crc kubenswrapper[4801]: E0122 14:05:13.570654 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.570475 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.570884 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:13 crc kubenswrapper[4801]: E0122 14:05:13.570787 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:13 crc kubenswrapper[4801]: E0122 14:05:13.571040 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:13 crc kubenswrapper[4801]: E0122 14:05:13.571211 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.603328 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.603388 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.603402 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.603426 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.603483 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.706394 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.706438 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.706465 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.706484 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.706495 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.809813 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.809858 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.809871 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.809888 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.809900 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.912724 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.912791 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.912815 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.912845 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:13 crc kubenswrapper[4801]: I0122 14:05:13.912867 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:13Z","lastTransitionTime":"2026-01-22T14:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.015853 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.015896 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.015912 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.015928 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.015938 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.119231 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.119276 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.119287 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.119306 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.119320 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.127555 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 17:33:16.332112153 +0000 UTC Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.221265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.221309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.221321 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.221341 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.221354 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.324040 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.324078 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.324089 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.324104 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.324116 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.426485 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.426566 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.426579 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.426595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.426604 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.529324 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.529391 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.529418 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.529488 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.529513 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.632303 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.632356 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.632365 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.632381 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.632392 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.736015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.736074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.736092 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.736113 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.736128 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.839533 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.839578 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.839588 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.839605 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.839616 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.942861 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.942915 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.942927 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.942947 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:14 crc kubenswrapper[4801]: I0122 14:05:14.942962 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:14Z","lastTransitionTime":"2026-01-22T14:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.045889 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.045951 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.045968 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.045989 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.046007 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.128668 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:35:24.861506643 +0000 UTC Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.148289 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.148330 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.148342 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.148362 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.148375 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.250475 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.250553 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.250566 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.250584 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.250599 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.352739 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.352786 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.352796 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.352812 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.352823 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.457222 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.457255 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.457263 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.457276 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.457287 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.560264 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.560296 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.560305 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.560319 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.560330 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.570757 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.570850 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:15 crc kubenswrapper[4801]: E0122 14:05:15.571037 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.571060 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:15 crc kubenswrapper[4801]: E0122 14:05:15.571123 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:15 crc kubenswrapper[4801]: E0122 14:05:15.571206 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.571084 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:15 crc kubenswrapper[4801]: E0122 14:05:15.571349 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.662413 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.662469 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.662479 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.662495 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.662507 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.764775 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.764819 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.764830 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.764845 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.764855 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.867699 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.867746 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.867759 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.867776 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.867787 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.969872 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.969932 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.969942 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.969954 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:15 crc kubenswrapper[4801]: I0122 14:05:15.969965 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:15Z","lastTransitionTime":"2026-01-22T14:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.024963 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.025011 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.025026 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.025041 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.025050 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: E0122 14:05:16.037060 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:16Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.040834 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.040890 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.040907 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.040928 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.040944 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: E0122 14:05:16.053096 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:16Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.057046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.057092 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.057103 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.057151 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.057171 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: E0122 14:05:16.069026 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:16Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.072555 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.072590 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.072600 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.072614 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.072623 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: E0122 14:05:16.083795 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:16Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.087291 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.087322 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.087330 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.087346 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.087354 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: E0122 14:05:16.097338 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:16Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:16 crc kubenswrapper[4801]: E0122 14:05:16.097586 4801 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.099265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.099304 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.099317 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.099334 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.099345 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.129649 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 22:17:41.50300761 +0000 UTC Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.202384 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.202427 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.202439 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.202469 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.202482 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.305474 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.305506 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.305514 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.305528 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.305538 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.408199 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.408252 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.408268 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.408291 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.408307 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.510653 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.510691 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.510702 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.510719 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.510731 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.613528 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.613601 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.613619 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.613644 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.613663 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.715992 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.716030 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.716042 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.716059 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.716073 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.823976 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.824011 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.824019 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.824034 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.824043 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.926371 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.926400 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.926409 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.926423 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:16 crc kubenswrapper[4801]: I0122 14:05:16.926432 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:16Z","lastTransitionTime":"2026-01-22T14:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.029135 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.029178 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.029189 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.029206 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.029217 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.130198 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 10:11:13.74854971 +0000 UTC Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.132019 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.132070 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.132082 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.132101 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.132112 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.235405 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.235497 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.235516 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.235539 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.235556 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.338245 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.338289 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.338299 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.338315 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.338327 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.441084 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.441174 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.441193 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.441213 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.441226 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.543300 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.543336 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.543348 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.543363 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.543375 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.571329 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:17 crc kubenswrapper[4801]: E0122 14:05:17.571435 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.571518 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:17 crc kubenswrapper[4801]: E0122 14:05:17.571589 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.572129 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:17 crc kubenswrapper[4801]: E0122 14:05:17.572202 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.572337 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:17 crc kubenswrapper[4801]: E0122 14:05:17.572406 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.645889 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.645930 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.645940 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.645956 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.645967 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.708850 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:17 crc kubenswrapper[4801]: E0122 14:05:17.709084 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:05:17 crc kubenswrapper[4801]: E0122 14:05:17.709175 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. No retries permitted until 2026-01-22 14:05:49.709150575 +0000 UTC m=+98.411050838 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.748236 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.748284 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.748300 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.748316 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.748328 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.850338 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.850414 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.850433 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.850476 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.850495 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.952640 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.952681 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.952693 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.952708 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:17 crc kubenswrapper[4801]: I0122 14:05:17.952721 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:17Z","lastTransitionTime":"2026-01-22T14:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.055081 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.055113 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.055126 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.055143 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.055155 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.130493 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 23:52:31.386027064 +0000 UTC Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.157582 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.157685 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.157712 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.157752 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.157783 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.187950 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/0.log" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.188003 4801 generic.go:334] "Generic (PLEG): container finished" podID="82373306-6578-4229-851f-1d80cdabf2d7" containerID="da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074" exitCode=1 Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.188036 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerDied","Data":"da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.188400 4801 scope.go:117] "RemoveContainer" containerID="da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.200907 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.214648 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.227670 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.242109 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.255971 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.260580 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.260660 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.260687 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.260719 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.260743 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.266466 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.288657 4801 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d44201
71006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.303680 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.318323 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.329990 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.347901 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.364051 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.365754 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.365802 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.365812 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.365826 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.365848 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.374214 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc 
kubenswrapper[4801]: I0122 14:05:18.387751 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.399895 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.410572 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.419875 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.431507 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:18Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.468596 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.468638 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.468654 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.468670 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.468680 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.570776 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.570814 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.570822 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.570833 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.570842 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.672894 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.672930 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.672938 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.672950 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.672959 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.774915 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.774947 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.774956 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.774970 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.774984 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.877003 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.877046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.877057 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.877073 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.877084 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.978569 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.978601 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.978612 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.978627 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:18 crc kubenswrapper[4801]: I0122 14:05:18.978639 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:18Z","lastTransitionTime":"2026-01-22T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.080153 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.080191 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.080202 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.080217 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.080228 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.130687 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 15:46:15.264252069 +0000 UTC Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.182115 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.182150 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.182161 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.182176 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.182187 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.192117 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/0.log" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.192173 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerStarted","Data":"bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.203517 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.213532 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc 
kubenswrapper[4801]: I0122 14:05:19.225803 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.247946 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.265083 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.282302 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.283793 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.283815 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.283825 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.283840 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.283851 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.303732 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.317671 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.330198 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.339935 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.351970 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.361533 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.371569 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.385909 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.385943 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.385954 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.385968 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.385978 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.388300 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.398667 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.409259 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] 
Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.421258 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6
c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.430626 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944caf
ce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:19Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.488631 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.488668 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.488680 4801 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.488696 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.488707 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.570529 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.570538 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.570540 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:19 crc kubenswrapper[4801]: E0122 14:05:19.570778 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:19 crc kubenswrapper[4801]: E0122 14:05:19.570650 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.570540 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:19 crc kubenswrapper[4801]: E0122 14:05:19.570850 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:19 crc kubenswrapper[4801]: E0122 14:05:19.570922 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.590865 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.590904 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.590913 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.590928 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.590940 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.692718 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.693047 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.693058 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.693072 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.693081 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.795863 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.795923 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.795939 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.795964 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.795981 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.897722 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.897758 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.897768 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.897782 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:19 crc kubenswrapper[4801]: I0122 14:05:19.897794 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:19Z","lastTransitionTime":"2026-01-22T14:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.000061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.000110 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.000121 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.000139 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.000150 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.103397 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.103464 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.103473 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.103489 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.103501 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.131610 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 01:42:38.22219255 +0000 UTC Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.205827 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.205866 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.205878 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.205894 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.205908 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.308312 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.308362 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.308376 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.308393 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.308411 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.411019 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.411066 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.411078 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.411098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.411111 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.513886 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.513959 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.513981 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.514009 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.514030 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.616483 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.616527 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.616538 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.616555 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.616566 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.719188 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.719228 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.719239 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.719257 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.719271 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.821243 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.821279 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.821287 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.821299 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.821307 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.924088 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.924124 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.924134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.924151 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:20 crc kubenswrapper[4801]: I0122 14:05:20.924162 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:20Z","lastTransitionTime":"2026-01-22T14:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.027015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.027054 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.027064 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.027080 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.027090 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.129669 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.129703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.129714 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.129728 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.129738 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.131851 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 11:41:39.390881975 +0000 UTC Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.231978 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.232018 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.232027 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.232041 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.232052 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.334913 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.334963 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.334979 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.334999 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.335011 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.437953 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.438001 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.438009 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.438023 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.438033 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.540641 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.540687 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.540699 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.540717 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.540730 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.571124 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.571152 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.571199 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:21 crc kubenswrapper[4801]: E0122 14:05:21.571223 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.571238 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:21 crc kubenswrapper[4801]: E0122 14:05:21.571296 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:21 crc kubenswrapper[4801]: E0122 14:05:21.571474 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:21 crc kubenswrapper[4801]: E0122 14:05:21.571748 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.572124 4801 scope.go:117] "RemoveContainer" containerID="1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.587240 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169
856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.603615 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\
"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.615308 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.624190 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.637585 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.642867 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.644413 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.644424 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.644459 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.644476 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.648202 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.660153 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.677308 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a
8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.688890 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.701238 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] 
Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.720511 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6
c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.740260 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.746552 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.746574 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.746586 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.746601 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.746612 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.750004 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.759545 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc 
kubenswrapper[4801]: I0122 14:05:21.771879 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.783092 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.794405 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.807441 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:21Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.849231 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.849255 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.849262 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.849274 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.849285 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.951469 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.951497 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.951506 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.951518 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:21 crc kubenswrapper[4801]: I0122 14:05:21.951528 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:21Z","lastTransitionTime":"2026-01-22T14:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.053637 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.053677 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.053687 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.053703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.053715 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.132591 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 12:55:46.026433912 +0000 UTC Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.155105 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.155177 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.155191 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.155207 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.155219 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.203856 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/2.log" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.206772 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.207188 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.220701 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865
f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.230820 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944caf
ce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.249980 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.257973 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.258017 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.258028 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.258046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.258055 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.261376 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce82
6e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.272609 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] 
Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.284743 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.309304 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.320642 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.331598 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc 
kubenswrapper[4801]: I0122 14:05:22.344850 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c0
31496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.358544 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.359881 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.359913 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.359921 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.359935 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.359944 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.372103 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.383089 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.394748 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e5
2e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.405496 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.415170 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.425688 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.435429 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.461976 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.462023 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.462038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.462058 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.462072 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.564348 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.564386 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.564396 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.564412 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.564422 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.666567 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.666596 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.666605 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.666618 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.666628 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.769251 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.769292 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.769303 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.769320 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.769332 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.872008 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.872060 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.872078 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.872147 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.872167 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.974619 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.974672 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.974684 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.974700 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:22 crc kubenswrapper[4801]: I0122 14:05:22.974712 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:22Z","lastTransitionTime":"2026-01-22T14:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.077959 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.077999 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.078007 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.078023 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.078032 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.133230 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 06:00:29.318336965 +0000 UTC Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.180223 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.180265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.180274 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.180291 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.180301 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.212591 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/3.log" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.213362 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/2.log" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.218695 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" exitCode=1 Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.218757 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.218814 4801 scope.go:117] "RemoveContainer" containerID="1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.219680 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:05:23 crc kubenswrapper[4801]: E0122 14:05:23.220584 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.234359 4801 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.245726 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.267257 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.278868 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.282232 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.282275 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.282286 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.282302 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.282314 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.291281 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.305027 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94231
57662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.317912 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.331213 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 
1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.340791 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.353078 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.364727 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.384640 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc0a51c71ac1824b69f99dd95680e9dc26601be0b526002a6a840d13eda4f60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:04:56Z\\\",\\\"message\\\":\\\", err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:04:56Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:04:56.238664 6405 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:22Z\\\",\\\"message\\\":\\\"ttps://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:05:22.341472 6780 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0122 14:05:22.341479 6780 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0122 14:05:22.341484 6780 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0122 14:05:22.341190 6780 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc007a6824d 0xc007a6824e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,Port:50051,TargetPort:{0 50051 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{olm.catalogSource: redhat-marketplace,olm.managed: 
tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.385651 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.385692 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.385704 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.385721 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.385733 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.395046 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.404631 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.422152 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.437982 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.449914 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.459735 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:23Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.489112 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.489149 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.489161 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.489176 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.489187 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.570878 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.570900 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.570948 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.570949 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:23 crc kubenswrapper[4801]: E0122 14:05:23.570989 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:23 crc kubenswrapper[4801]: E0122 14:05:23.571070 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:23 crc kubenswrapper[4801]: E0122 14:05:23.571189 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:23 crc kubenswrapper[4801]: E0122 14:05:23.571296 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.592714 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.592769 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.592786 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.592809 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.592828 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.695622 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.695913 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.695977 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.696045 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.696102 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.798500 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.798732 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.798792 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.798866 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.798924 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.901432 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.902013 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.902103 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.902188 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:23 crc kubenswrapper[4801]: I0122 14:05:23.902273 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:23Z","lastTransitionTime":"2026-01-22T14:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.003984 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.004241 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.004309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.004377 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.004433 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.106315 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.106714 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.106911 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.107096 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.107301 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.133644 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:58:49.387629212 +0000 UTC Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.209552 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.209603 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.209614 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.209633 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.209645 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.223145 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/3.log" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.225950 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:05:24 crc kubenswrapper[4801]: E0122 14:05:24.226116 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.243172 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.257883 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.269078 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.285907 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:22Z\\\",\\\"message\\\":\\\"ttps://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:05:22.341472 6780 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0122 14:05:22.341479 6780 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0122 14:05:22.341484 6780 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0122 14:05:22.341190 6780 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc007a6824d 0xc007a6824e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,Port:50051,TargetPort:{0 50051 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{olm.catalogSource: redhat-marketplace,olm.managed: tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:05:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.295799 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.308070 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc 
kubenswrapper[4801]: I0122 14:05:24.315596 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.315744 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.315837 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.315915 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.315973 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.320531 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.332412 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.343315 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.352895 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.363937 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.374696 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.385537 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.396754 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.409206 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] 
Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.418339 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.418505 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.418571 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.418663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.418758 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.422392 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.432610 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.457266 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:24Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.521234 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.521294 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.521312 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.521335 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.521352 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.624593 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.624627 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.624634 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.624648 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.624659 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.727488 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.727783 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.727897 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.727995 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.728097 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.831918 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.832498 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.832713 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.832864 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.833043 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.935502 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.936046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.936172 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.936301 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:24 crc kubenswrapper[4801]: I0122 14:05:24.936385 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:24Z","lastTransitionTime":"2026-01-22T14:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.039441 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.039494 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.039502 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.039516 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.039525 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.134345 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 00:33:00.209168012 +0000 UTC Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.142061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.142101 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.142115 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.142134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.142149 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.244282 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.244751 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.244944 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.245170 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.245359 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.348156 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.348205 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.348216 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.348232 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.348245 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.450510 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.450538 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.450545 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.450557 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.450565 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.554039 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.554756 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.554769 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.554783 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.554793 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.572758 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:25 crc kubenswrapper[4801]: E0122 14:05:25.572878 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.573039 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:25 crc kubenswrapper[4801]: E0122 14:05:25.573083 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.573172 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:25 crc kubenswrapper[4801]: E0122 14:05:25.573213 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.573312 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:25 crc kubenswrapper[4801]: E0122 14:05:25.573360 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.657225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.657300 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.657316 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.657337 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.657350 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.759932 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.759989 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.760010 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.760032 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.760051 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.862822 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.862862 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.862874 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.862928 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.862940 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.965308 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.965396 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.965411 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.965437 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:25 crc kubenswrapper[4801]: I0122 14:05:25.965476 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:25Z","lastTransitionTime":"2026-01-22T14:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.067730 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.067768 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.067776 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.067791 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.067799 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.135546 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 15:05:01.353689915 +0000 UTC Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.170008 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.170038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.170046 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.170061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.170070 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.272239 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.272287 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.272299 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.272320 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.272333 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.358553 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.358807 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.358889 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.358982 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.359057 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: E0122 14:05:26.371922 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:26Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.375828 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.375953 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.376027 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.376091 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.376146 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: E0122 14:05:26.390753 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:26Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.394684 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.394715 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.394723 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.394737 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.394746 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: E0122 14:05:26.410123 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:26Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.415442 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.415504 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.415547 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.415569 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.415586 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: E0122 14:05:26.430299 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:26Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.433986 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.434015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.434026 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.434041 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.434050 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: E0122 14:05:26.447904 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:26Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:26 crc kubenswrapper[4801]: E0122 14:05:26.448270 4801 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.449884 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.449917 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.449929 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.449945 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.449956 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.553511 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.553834 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.553983 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.554134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.554317 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.656516 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.656560 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.656571 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.656588 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.656600 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.759239 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.759289 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.759306 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.759327 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.759341 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.861764 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.862002 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.862066 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.862128 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.862374 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.964581 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.964624 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.964641 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.964662 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:26 crc kubenswrapper[4801]: I0122 14:05:26.964686 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:26Z","lastTransitionTime":"2026-01-22T14:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.067120 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.067159 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.067170 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.067186 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.067194 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.136485 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:07:05.897851972 +0000 UTC Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.170070 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.170114 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.170126 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.170146 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.170161 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.273195 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.273244 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.273255 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.273285 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.273295 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.375638 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.375733 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.375764 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.375796 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.375819 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.478821 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.478870 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.478884 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.478901 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.478913 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.571172 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.571178 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.571652 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:27 crc kubenswrapper[4801]: E0122 14:05:27.571781 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:27 crc kubenswrapper[4801]: E0122 14:05:27.571815 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:27 crc kubenswrapper[4801]: E0122 14:05:27.571866 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.571889 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:27 crc kubenswrapper[4801]: E0122 14:05:27.571963 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.580885 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.582982 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.583018 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.583028 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.583045 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.583057 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.685101 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.685146 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.685157 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.685174 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.685185 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.787268 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.787308 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.787320 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.787337 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.787350 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.890018 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.890069 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.890080 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.890097 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.890108 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.992267 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.992302 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.992309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.992324 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:27 crc kubenswrapper[4801]: I0122 14:05:27.992333 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:27Z","lastTransitionTime":"2026-01-22T14:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.095081 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.095130 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.095142 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.095158 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.095169 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.137016 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 01:14:45.877728783 +0000 UTC
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.198358 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.198405 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.198413 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.198428 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.198440 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.300924 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.301218 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.301391 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.301556 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.301687 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.403875 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.404189 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.404269 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.404336 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.404403 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.507055 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.507098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.507108 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.507126 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.507137 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.609993 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.610305 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.610410 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.610536 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.610615 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.712861 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.712906 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.712917 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.712933 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.712944 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.815385 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.815429 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.815441 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.815474 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.815488 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.917961 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.917994 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.918006 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.918021 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:28 crc kubenswrapper[4801]: I0122 14:05:28.918031 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:28Z","lastTransitionTime":"2026-01-22T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.022123 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.022181 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.022194 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.022217 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.022234 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.124556 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.124616 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.124634 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.124656 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.124669 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.137824 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 08:53:03.488849915 +0000 UTC
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.226991 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.227037 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.227048 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.227065 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.227076 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.329621 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.329659 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.329672 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.329689 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.329699 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.432151 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.432196 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.432207 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.432223 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.432234 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.534846 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.534905 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.534921 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.534944 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.534962 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.570603 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.570613 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5"
Jan 22 14:05:29 crc kubenswrapper[4801]: E0122 14:05:29.570822 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.570911 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.570646 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 22 14:05:29 crc kubenswrapper[4801]: E0122 14:05:29.571076 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa"
Jan 22 14:05:29 crc kubenswrapper[4801]: E0122 14:05:29.571145 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 22 14:05:29 crc kubenswrapper[4801]: E0122 14:05:29.571229 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.637405 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.637460 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.637471 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.637484 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.637496 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.739513 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.739583 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.739597 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.739615 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.739650 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.841905 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.841940 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.841949 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.841964 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.841974 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.944368 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.944400 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.944408 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.944421 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:29 crc kubenswrapper[4801]: I0122 14:05:29.944430 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:29Z","lastTransitionTime":"2026-01-22T14:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.047172 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.047213 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.047225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.047240 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.047248 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.138224 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:17:24.716034062 +0000 UTC
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.149892 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.149939 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.149948 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.149962 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.149972 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.252437 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.252491 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.252510 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.252528 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.252543 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.354616 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.354663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.354675 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.354691 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.354702 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.458210 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.458265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.458283 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.458304 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.458319 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.561219 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.561279 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.561297 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.561321 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.561337 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.664202 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.664245 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.664266 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.664285 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.664296 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.766955 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.767033 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.767047 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.767064 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.767076 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.869973 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.870021 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.870031 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.870099 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.870113 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.973114 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.973151 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.973161 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.973176 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:30 crc kubenswrapper[4801]: I0122 14:05:30.973187 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:30Z","lastTransitionTime":"2026-01-22T14:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.075533 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.075576 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.075584 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.075598 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.075608 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.138795 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:27:46.069384888 +0000 UTC Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.177836 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.177890 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.177907 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.177928 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.177946 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.279922 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.279963 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.279974 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.279992 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.280003 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.382412 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.382440 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.382463 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.382476 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.382485 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.485594 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.485650 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.485663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.485685 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.485698 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.571325 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.571496 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.571671 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.571717 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:31 crc kubenswrapper[4801]: E0122 14:05:31.571667 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:31 crc kubenswrapper[4801]: E0122 14:05:31.571857 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:31 crc kubenswrapper[4801]: E0122 14:05:31.572002 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:31 crc kubenswrapper[4801]: E0122 14:05:31.572052 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.586265 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.587737 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.587776 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.587788 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.587805 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.587816 4801 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.604153 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.617685 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.632476 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.655302 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:22Z\\\",\\\"message\\\":\\\"ttps://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:05:22.341472 6780 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0122 14:05:22.341479 6780 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0122 14:05:22.341484 6780 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0122 14:05:22.341190 6780 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc007a6824d 0xc007a6824e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,Port:50051,TargetPort:{0 50051 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{olm.catalogSource: redhat-marketplace,olm.managed: tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:05:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.665896 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.675779 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc 
kubenswrapper[4801]: I0122 14:05:31.685828 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a1ba4-25be-45b9-a6e5-d4f787f8c364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54c480609a0e2fcce63319e31aefb6243cff07e1d9ec999b5a08d3a2e1645c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://f67ebee685ad27e1c005459ae3f4cbdf4633836ec043eb1e4935e662c45e930e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f67ebee685ad27e1c005459ae3f4cbdf4633836ec043eb1e4935e662c45e930e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.690267 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.690309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.690326 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 
14:05:31.690350 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.690369 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.697803 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.711748 4801 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.721901 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.731591 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.743356 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.753786 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.771567 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.790081 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.793932 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.794047 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.794060 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.794077 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.794088 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.804868 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.820995 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94231
57662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.835292 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944cafce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-22T14:05:31Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.896861 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.896905 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.896916 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.896934 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.896946 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.999576 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.999626 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.999637 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.999653 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:31 crc kubenswrapper[4801]: I0122 14:05:31.999663 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:31Z","lastTransitionTime":"2026-01-22T14:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.103733 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.103792 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.103803 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.103822 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.103835 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.139387 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 04:05:42.258496227 +0000 UTC Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.205727 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.205767 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.205775 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.205788 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.205797 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.308508 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.308550 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.308563 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.308581 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.308593 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.410379 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.410419 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.410427 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.410441 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.410467 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.512597 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.512640 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.512650 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.512663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.512672 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.615401 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.615433 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.615441 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.615488 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.615499 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.717689 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.717733 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.717747 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.717768 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.717783 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.820719 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.820757 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.820768 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.820783 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.820793 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.923810 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.923881 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.923897 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.923916 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:32 crc kubenswrapper[4801]: I0122 14:05:32.923929 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:32Z","lastTransitionTime":"2026-01-22T14:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.026276 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.026311 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.026320 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.026335 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.026344 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.128799 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.128853 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.128869 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.128891 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.128907 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.140028 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 03:26:39.641797865 +0000 UTC Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.231364 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.231405 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.231416 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.231433 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.231460 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.334079 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.334123 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.334134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.334152 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.334164 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.436616 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.436652 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.436663 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.436679 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.436688 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.539322 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.539383 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.539406 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.539429 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.539497 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.571094 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.571172 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.571230 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:33 crc kubenswrapper[4801]: E0122 14:05:33.571331 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:33 crc kubenswrapper[4801]: E0122 14:05:33.571416 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.571520 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:33 crc kubenswrapper[4801]: E0122 14:05:33.571724 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:33 crc kubenswrapper[4801]: E0122 14:05:33.571527 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.641545 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.641616 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.641669 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.641703 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.641718 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.748282 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.748537 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.748660 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.748742 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.748811 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.850510 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.850738 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.850845 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.850925 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.850991 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.952804 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.953015 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.953087 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.953160 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:33 crc kubenswrapper[4801]: I0122 14:05:33.953243 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:33Z","lastTransitionTime":"2026-01-22T14:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.056130 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.056394 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.056593 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.056734 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.056869 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.140899 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:48:00.666024182 +0000 UTC Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.159847 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.159892 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.159903 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.159920 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.159930 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.261944 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.261985 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.261996 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.262011 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.262023 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.364667 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.364735 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.364752 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.364777 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.364795 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.468065 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.468281 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.468290 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.468303 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.468312 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.570902 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.570941 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.570952 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.570967 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.570995 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.673874 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.673916 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.673927 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.673946 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.673957 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.776708 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.776773 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.776788 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.776828 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.776844 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.879477 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.879518 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.879531 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.879548 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.879559 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.982193 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.982242 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.982251 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.982267 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:34 crc kubenswrapper[4801]: I0122 14:05:34.982279 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:34Z","lastTransitionTime":"2026-01-22T14:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.085298 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.085358 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.085376 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.085401 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.085418 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.141869 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 08:38:48.586875342 +0000 UTC Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.187919 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.187972 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.187981 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.187995 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.188011 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.286110 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.286180 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286317 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286333 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286343 4801 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286385 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.286371559 +0000 UTC m=+147.988271742 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286516 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286555 4801 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286566 4801 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.286636 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.286616046 +0000 UTC m=+147.988516279 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.291283 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.291354 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.291380 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.291408 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.291428 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.394252 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.394301 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.394314 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.394332 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.394344 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.487397 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.487551 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.487599 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.487659 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.487628008 +0000 UTC m=+148.189528191 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.487694 4801 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.487703 4801 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.487755 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.487745922 +0000 UTC m=+148.189646105 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.487780 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-22 14:06:39.487762732 +0000 UTC m=+148.189662985 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.497076 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.497113 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.497122 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.497135 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.497148 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.570921 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.570982 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.570946 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.570931 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.571125 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.571231 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.571291 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:35 crc kubenswrapper[4801]: E0122 14:05:35.571340 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.599348 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.599390 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.599401 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.599469 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.599484 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.702001 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.702038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.702050 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.702065 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.702074 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.804033 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.804082 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.804095 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.804111 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.804123 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.907252 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.907298 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.907316 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.907341 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:35 crc kubenswrapper[4801]: I0122 14:05:35.907363 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:35Z","lastTransitionTime":"2026-01-22T14:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.010153 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.010183 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.010192 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.010205 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.010214 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.113541 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.113596 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.113613 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.113633 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.113646 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.143509 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 07:14:54.242882196 +0000 UTC Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.215315 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.215358 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.215368 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.215384 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.215395 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.319207 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.319265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.319281 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.319312 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.319329 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.422002 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.422072 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.422090 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.422114 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.422132 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.509526 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.509577 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.509588 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.509608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.509621 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: E0122 14:05:36.526761 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.531523 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.531565 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.531577 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.531595 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.531605 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: E0122 14:05:36.549396 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.553408 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.553441 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.553463 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.553478 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.553489 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: E0122 14:05:36.566503 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.570269 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.570306 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.570317 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.570334 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.570350 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: E0122 14:05:36.584530 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.588038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.588074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.588085 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.588101 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.588112 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: E0122 14:05:36.602582 4801 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7eb95bad-466a-4ba0-acdb-d80e8372ebf6\\\",\\\"systemUUID\\\":\\\"15ca911a-5716-4218-946a-9f253061c5e8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:36Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:36 crc kubenswrapper[4801]: E0122 14:05:36.602731 4801 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.604588 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.604622 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.604646 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.604666 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.604681 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.706740 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.706771 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.706780 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.706794 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.706803 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.809134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.809190 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.809202 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.809222 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.809238 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.911690 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.911729 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.911741 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.911761 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:36 crc kubenswrapper[4801]: I0122 14:05:36.911774 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:36Z","lastTransitionTime":"2026-01-22T14:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.014666 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.014756 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.014799 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.014830 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.014853 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.117788 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.117835 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.117846 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.117864 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.117876 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.144703 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 05:33:25.787379535 +0000 UTC Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.220512 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.220561 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.220572 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.220590 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.220601 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.323088 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.323125 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.323134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.323148 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.323160 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.425264 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.425314 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.425324 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.425340 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.425352 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.527903 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.527942 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.527958 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.527973 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.527987 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.571704 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:37 crc kubenswrapper[4801]: E0122 14:05:37.571865 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.571702 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:37 crc kubenswrapper[4801]: E0122 14:05:37.571930 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.571728 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.571709 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:37 crc kubenswrapper[4801]: E0122 14:05:37.571992 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:37 crc kubenswrapper[4801]: E0122 14:05:37.572207 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.630981 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.631047 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.631068 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.631099 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.631121 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.737127 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.737204 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.737244 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.737430 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.737514 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.840564 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.840629 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.840638 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.840670 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.840683 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.943836 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.943913 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.943925 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.943941 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:37 crc kubenswrapper[4801]: I0122 14:05:37.943975 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:37Z","lastTransitionTime":"2026-01-22T14:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.046505 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.046584 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.046607 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.046635 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.046657 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.145493 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 03:00:17.336922643 +0000 UTC Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.149520 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.149559 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.149573 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.149590 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.149603 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.251547 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.251583 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.251594 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.251608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.251619 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.354082 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.354126 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.354139 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.354153 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.354164 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.456901 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.456933 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.456947 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.456970 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.456983 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.559771 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.559820 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.559832 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.559848 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.559860 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.571211 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:05:38 crc kubenswrapper[4801]: E0122 14:05:38.571376 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.662652 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.662710 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.662721 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.662740 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.663015 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.765309 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.765354 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.765363 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.765377 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.765389 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.867647 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.867680 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.867689 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.867702 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.867710 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.970001 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.970049 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.970061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.970076 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:38 crc kubenswrapper[4801]: I0122 14:05:38.970085 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:38Z","lastTransitionTime":"2026-01-22T14:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.073017 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.073064 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.073077 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.073095 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.073109 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.145663 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 10:29:40.747279587 +0000 UTC Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.175780 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.175830 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.175842 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.175862 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.175876 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.278098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.278131 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.278141 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.278155 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.278165 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.380718 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.380772 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.380788 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.380811 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.380825 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.483943 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.483984 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.483995 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.484013 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.484026 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.571011 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.571098 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.571124 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.571068 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:39 crc kubenswrapper[4801]: E0122 14:05:39.571238 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:39 crc kubenswrapper[4801]: E0122 14:05:39.571277 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:39 crc kubenswrapper[4801]: E0122 14:05:39.571325 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:39 crc kubenswrapper[4801]: E0122 14:05:39.571491 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.585960 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.586002 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.586013 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.586026 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.586036 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.688432 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.688497 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.688506 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.688520 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.688531 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.790609 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.790658 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.790672 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.790727 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.790740 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.893188 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.893218 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.893242 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.893260 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.893272 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.995731 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.995773 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.995785 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.995800 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:39 crc kubenswrapper[4801]: I0122 14:05:39.995814 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:39Z","lastTransitionTime":"2026-01-22T14:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.098147 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.098199 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.098209 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.098227 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.098238 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.145970 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 19:52:23.460205896 +0000 UTC Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.204516 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.204599 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.204618 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.204648 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.204664 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.308007 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.308358 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.308478 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.308573 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.308660 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.410738 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.410779 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.410794 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.410815 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.410828 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.513552 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.513597 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.513608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.513625 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.513638 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.615779 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.615830 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.615842 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.615861 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.615877 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.718355 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.718402 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.718413 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.718430 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.718441 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.820794 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.820843 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.820856 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.820872 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.820884 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.923217 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.923550 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.923653 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.923741 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:40 crc kubenswrapper[4801]: I0122 14:05:40.923820 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:40Z","lastTransitionTime":"2026-01-22T14:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.026888 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.027166 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.027273 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.027356 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.027434 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.129668 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.129733 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.129754 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.129782 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.129802 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.146335 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 23:37:02.440626881 +0000 UTC Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.231760 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.232061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.232161 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.232263 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.232363 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.334470 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.334502 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.334514 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.334529 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.334541 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.437687 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.437742 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.437754 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.437767 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.437777 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.540523 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.540798 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.540867 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.540958 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.541041 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.571416 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.571433 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.571493 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.571722 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:41 crc kubenswrapper[4801]: E0122 14:05:41.571908 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:41 crc kubenswrapper[4801]: E0122 14:05:41.572014 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:41 crc kubenswrapper[4801]: E0122 14:05:41.572230 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:41 crc kubenswrapper[4801]: E0122 14:05:41.572336 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.586238 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.600650 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf6285a2e8de7fc5589fe456c7804b162781ce7f28271463f4de3ee32a6eb8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.629577 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9e6edf0-c0bc-418b-a90c-8f86d9fa0254\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c34bbb4115f89afe081124462e20064f2ec9c35f4f9fdaf36c240aca66d0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2571bd421430da80125516cbf555c4143c5e22753263a86c7a981a351d6160a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed4850b3266e7093d340ef3a631ced271263f627155c4572703e37134b8e78dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd45eed8768113e121e6c9d6e691c37b23552129d212dd7eb68a154563dadda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842a2c5cc8d823bf2953e322fc92a7716eaf1db9e37d1a1eeea2bf5b1f22d9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4deda7b30b4e8b0963964107909196207d2bed4d4420171006143d34dfe07e8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be9faa780d960882581ec91776b788dd4b562654ddf1e8c3fba0c0b17447c7a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7aa660afadba50eba6e2ff99ef3a8e721afd8eaa6e4effa490003bedf08376b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:14Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.643992 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.644074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.644097 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.644127 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.644151 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.645896 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd6ca0b4-eefa-49a7-b6cd-70c85251d71a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3de0447c22f9198f2a4bfaf4cd478ac7a27d0f38bbf164731cb59e29285bc025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5210a33795698cbd47071aec8433d9f6a5af378edcee59cf5d314f54e1df4605\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4128d9e136fa5b8355078dbc1a7e1512e1467a8ea84aa7a22d459d17e1e7d901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.660228 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-b2p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82373306-6578-4229-851f-1d80cdabf2d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:17Z\\\",\\\"message\\\":\\\"2026-01-22T14:04:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1\\\\n2026-01-22T14:04:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_46519917-35c3-4d9f-9d71-c6b56dc809e1 to /host/opt/cni/bin/\\\\n2026-01-22T14:04:32Z [verbose] multus-daemon started\\\\n2026-01-22T14:04:32Z [verbose] 
Readiness Indicator file check\\\\n2026-01-22T14:05:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh2zn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-b2p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.676836 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lstn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b189f-1574-4157-ac9f-03282964c451\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6
c8a67ab2c99865f9015fc0e546b5bfe3666c70d4674ea99d986034e0ebe02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90bf8fe2799ae6eb30ff76fb129741642f1da9faf5c112e8b1b2e72157b2b07a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6a82dc8fd10ca21fdab5c92c8f18da13354871235ad4a687fb4f491dbb890bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f64bbc8dc913083c5363632a3ab18bdb13b4d3da26bb4731231132ec7d7055\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9423157662ade37c623d3e518bbf31818571eec879bcb3392636409ddf087e49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69a918c952feb9337ac174b584fc7504b95e75dd2c68aaf0d20dec27e04ab2ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d2d54836ccc0faa0a06f29425ce8c9c849c7205c17975af67c25bff14b3ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jzw75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lstn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.687711 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5bab04-307a-449a-a36d-572d1bb5c66b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55119a76c22f40fbfee9c962a5e3482d1991a40c9f4ae9300f8b08c60aa2df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e962e4ad5da08611075d0f79e483799944caf
ce2ae13add7e09ad688c0ad51db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s72x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rbpln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.703963 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc7331d1-1b1f-44b3-b4be-83fd708d9c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T14:04:31Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 14:04:24.970018 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 14:04:24.971834 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-561304579/tls.crt::/tmp/serving-cert-561304579/tls.key\\\\\\\"\\\\nI0122 14:04:31.056999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 14:04:31.059368 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 14:04:31.059387 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 14:04:31.059407 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 14:04:31.059424 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 14:04:31.073441 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0122 14:04:31.073497 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0122 14:04:31.073513 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073523 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 14:04:31.073533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 14:04:31.073539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 14:04:31.073545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 14:04:31.073550 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0122 14:04:31.084270 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38c55e78ab0e017e99d2b64cedd439
6ccb11db26eb57f13e822229eaef0c0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.716809 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65444675-eda5-4cb8-8d1e-81267d61fd7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d365e87405516a9d52f2c83a64084d9fcd7e4c345fe9ee9f53a3aedc4a7c7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd61271ca935d2247d9e0ad83d8aa32e11d1cb099b9b1d18c1ae6164738239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b078c68ddb2fb59c64e6d8b0986a13e15972dc50ebff8514581bdee45788ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3f46cf9c89a234b8f41ffc00b2ffc877713037ec3c06be3dc48a096a1a8cd3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.730257 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05e9263c679f6c3c5c7a8a4d7b38a2c7c7fb2f7d4a6f99f8cde71a6bdb611f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.742051 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.746162 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.746213 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.746227 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.746244 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.746257 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.761325 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-22T14:05:22Z\\\",\\\"message\\\":\\\"ttps://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:22Z is after 2025-08-24T17:21:41Z]\\\\nI0122 14:05:22.341472 6780 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0122 
14:05:22.341479 6780 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0122 14:05:22.341484 6780 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0122 14:05:22.341190 6780 services_controller.go:434] Service openshift-marketplace/redhat-marketplace retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{redhat-marketplace openshift-marketplace cf6d00ec-cc2c-43f6-815c-40ffd0563e71 5558 0 2025-02-23 05:23:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[olm.managed:true olm.service-spec-hash:aUeLNNcZzVZO2rcaZ5Kc8V3jffO0Ss4T6qX6V5] map[] [{operators.coreos.com/v1alpha1 CatalogSource redhat-marketplace fcb55c30-a739-4bc1-9c9c-7634e05a3dbd 0xc007a6824d 0xc007a6824e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:grpc,Protocol:TCP,Port:50051,TargetPort:{0 50051 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{olm.catalogSource: redhat-marketplace,olm.managed: tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T14:05:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac6007e207e28fca7
ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvwm2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx7sl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.772307 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jnfrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31dbc047-7646-4c7b-a9e8-3b8949ea027d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708e0705a2f3ef8dac598b183b0f2a49979721ad5da8380c61a0a1f5c4efddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmcv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jnfrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.782492 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18bcc554-da90-40e9-b32f-e0d5d0936faa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8hr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ph2s5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc 
kubenswrapper[4801]: I0122 14:05:41.791311 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a1ba4-25be-45b9-a6e5-d4f787f8c364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54c480609a0e2fcce63319e31aefb6243cff07e1d9ec999b5a08d3a2e1645c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://f67ebee685ad27e1c005459ae3f4cbdf4633836ec043eb1e4935e662c45e930e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f67ebee685ad27e1c005459ae3f4cbdf4633836ec043eb1e4935e662c45e930e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T14:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T14:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.801622 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e31583b6300964053edee5f2f34f729eaaf5b0f6ffe5da0c23ea55bc948010b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccb19cc889fe978021c2a1845bba7bbb42319331f7ada05f125733b0f70e4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.812623 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.821351 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gf7j5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdefaaa4-21e6-4d0b-bf3f-a0dcbbc44ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34bce852973822645010b41715ebbc94a13237210bf0861fbe1f964288131586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54ncw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gf7j5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.830431 4801 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b842046-5300-4281-9d73-3ae42f0d56da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T14:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ac6b516e4eb9feb6b169856731f6973e299f907cdf26898b0b36a9a3ba56183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T14:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntz2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T14:04:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5t2tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T14:05:41Z is after 2025-08-24T17:21:41Z" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.848210 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.848265 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.848277 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.848291 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.848299 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.951391 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.951501 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.951516 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.951539 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:41 crc kubenswrapper[4801]: I0122 14:05:41.951554 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:41Z","lastTransitionTime":"2026-01-22T14:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.053858 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.053899 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.053909 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.053924 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.053937 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.147877 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 20:16:34.328374171 +0000 UTC Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.156260 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.156303 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.156316 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.156332 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.156344 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.258754 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.258805 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.258816 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.258833 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.258846 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.361180 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.361213 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.361225 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.361240 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.361251 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.463515 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.463557 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.463564 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.463578 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.463587 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.566145 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.566188 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.566200 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.566216 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.566227 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.668255 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.668286 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.668293 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.668308 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.668326 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.771973 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.772025 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.772037 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.772058 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.772071 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.874154 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.874195 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.874205 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.874221 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.874232 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.976608 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.976661 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.976675 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.976695 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:42 crc kubenswrapper[4801]: I0122 14:05:42.976707 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:42Z","lastTransitionTime":"2026-01-22T14:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.079523 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.079564 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.079575 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.079599 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.079613 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.148741 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 16:33:15.800418484 +0000 UTC Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.182254 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.182289 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.182301 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.182318 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.182330 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.283984 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.284026 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.284037 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.284053 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.284064 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.386770 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.386811 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.386822 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.386837 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.386849 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.489541 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.489585 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.489600 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.489616 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.489627 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.571229 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.571252 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.571480 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:43 crc kubenswrapper[4801]: E0122 14:05:43.571377 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:43 crc kubenswrapper[4801]: E0122 14:05:43.571628 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:43 crc kubenswrapper[4801]: E0122 14:05:43.571725 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.571919 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:43 crc kubenswrapper[4801]: E0122 14:05:43.572011 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.591898 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.592137 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.592202 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.592281 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.592379 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.694169 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.694203 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.694214 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.694230 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.694241 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.797107 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.797157 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.797166 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.797185 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.797195 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.899412 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.899491 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.899508 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.899529 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:43 crc kubenswrapper[4801]: I0122 14:05:43.899543 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:43Z","lastTransitionTime":"2026-01-22T14:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.002741 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.002795 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.002809 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.002832 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.002848 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.105645 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.105729 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.105743 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.105767 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.105782 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.150158 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 22:12:16.573267718 +0000 UTC Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.208393 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.208491 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.208506 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.208543 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.208559 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.311117 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.311161 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.311173 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.311191 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.311203 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.413958 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.414020 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.414038 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.414061 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.414077 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.516083 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.516134 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.516146 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.516168 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.516180 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.618602 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.618660 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.618684 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.618706 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.618720 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.721347 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.721414 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.721434 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.721491 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.721514 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.824413 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.824521 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.824545 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.824574 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.824598 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.927847 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.927900 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.927913 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.927931 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:44 crc kubenswrapper[4801]: I0122 14:05:44.927944 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:44Z","lastTransitionTime":"2026-01-22T14:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.029742 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.029803 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.029816 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.029834 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.029846 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.132653 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.132710 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.132720 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.132737 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.132750 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.151231 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 21:12:24.386934653 +0000 UTC Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.235028 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.235098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.235113 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.235135 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.235150 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.339971 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.340062 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.340077 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.340098 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.340118 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.443270 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.443327 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.443344 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.443367 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.443384 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.545748 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.545790 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.545801 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.545816 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.545827 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.571673 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.571729 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.571754 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 22 14:05:45 crc kubenswrapper[4801]: E0122 14:05:45.571820 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.572038 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 22 14:05:45 crc kubenswrapper[4801]: E0122 14:05:45.572093 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 22 14:05:45 crc kubenswrapper[4801]: E0122 14:05:45.572234 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 22 14:05:45 crc kubenswrapper[4801]: E0122 14:05:45.572371 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.648540 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.648592 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.648609 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.648631 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.648649 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.751373 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.751441 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.751493 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.751516 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.751538 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.853930 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.853977 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.853990 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.854016 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.854032 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.957368 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.957409 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.957421 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.957437 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:45 crc kubenswrapper[4801]: I0122 14:05:45.957472 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:45Z","lastTransitionTime":"2026-01-22T14:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.060034 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.060074 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.060101 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.060115 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.060126 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.151516 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 08:05:16.32219318 +0000 UTC
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.162334 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.162375 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.162388 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.162404 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.162416 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.265672 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.265720 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.265729 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.265744 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.265755 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.368280 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.368319 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.368329 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.368343 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.368353 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.471799 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.471882 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.471902 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.471929 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.471958 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.575196 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.575262 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.575272 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.575286 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.575295 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.677430 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.677508 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.677521 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.677538 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.677550 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.764085 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.764120 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.764128 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.764140 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.764148 4801 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T14:05:46Z","lastTransitionTime":"2026-01-22T14:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.808873 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"]
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.809275 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.811170 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.811474 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.811625 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.814124 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.841382 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b2p9x" podStartSLOduration=75.841362604 podStartE2EDuration="1m15.841362604s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:46.826116227 +0000 UTC m=+95.528016460" watchObservedRunningTime="2026-01-22 14:05:46.841362604 +0000 UTC m=+95.543262787"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.850829 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lstn7" podStartSLOduration=75.850806755 podStartE2EDuration="1m15.850806755s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:46.841292682 +0000 UTC m=+95.543192875" watchObservedRunningTime="2026-01-22 14:05:46.850806755 +0000 UTC m=+95.552706938"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.874642 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.874621668 podStartE2EDuration="1m15.874621668s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:46.874485914 +0000 UTC m=+95.576386117" watchObservedRunningTime="2026-01-22 14:05:46.874621668 +0000 UTC m=+95.576521851"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.874891 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rbpln" podStartSLOduration=74.874883395 podStartE2EDuration="1m14.874883395s" podCreationTimestamp="2026-01-22 14:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:46.851484894 +0000 UTC m=+95.553385087" watchObservedRunningTime="2026-01-22 14:05:46.874883395 +0000 UTC m=+95.576783578"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.894833 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.894817297 podStartE2EDuration="1m8.894817297s" podCreationTimestamp="2026-01-22 14:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:46.894602591 +0000 UTC m=+95.596502784" watchObservedRunningTime="2026-01-22 14:05:46.894817297 +0000 UTC m=+95.596717480"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.957942 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jnfrc" podStartSLOduration=76.957921656 podStartE2EDuration="1m16.957921656s" podCreationTimestamp="2026-01-22 14:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:46.957660748 +0000 UTC m=+95.659560931" watchObservedRunningTime="2026-01-22 14:05:46.957921656 +0000 UTC m=+95.659821839"
Jan 22 14:05:46 crc kubenswrapper[4801]: I0122 14:05:46.984613 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.98459389 podStartE2EDuration="1m15.98459389s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:46.98319868 +0000 UTC m=+95.685098863" watchObservedRunningTime="2026-01-22 14:05:46.98459389 +0000 UTC m=+95.686494073"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.009077 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/10f3ac2f-9289-487c-b062-6d36a4ab0508-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.009374 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10f3ac2f-9289-487c-b062-6d36a4ab0508-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.009531 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10f3ac2f-9289-487c-b062-6d36a4ab0508-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.009665 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f3ac2f-9289-487c-b062-6d36a4ab0508-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.009782 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/10f3ac2f-9289-487c-b062-6d36a4ab0508-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.023705 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.023683341 podStartE2EDuration="44.023683341s" podCreationTimestamp="2026-01-22 14:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:47.023389052 +0000 UTC m=+95.725289245" watchObservedRunningTime="2026-01-22 14:05:47.023683341 +0000 UTC m=+95.725583524"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.071037 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gf7j5" podStartSLOduration=77.071016848 podStartE2EDuration="1m17.071016848s" podCreationTimestamp="2026-01-22 14:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:47.070907405 +0000 UTC m=+95.772807588" watchObservedRunningTime="2026-01-22 14:05:47.071016848 +0000 UTC m=+95.772917031"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.095404 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podStartSLOduration=76.095382516 podStartE2EDuration="1m16.095382516s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:47.081294952 +0000 UTC m=+95.783195135" watchObservedRunningTime="2026-01-22 14:05:47.095382516 +0000 UTC m=+95.797282699"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.104590 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=20.104563499 podStartE2EDuration="20.104563499s" podCreationTimestamp="2026-01-22 14:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:47.104369544 +0000 UTC m=+95.806269737" watchObservedRunningTime="2026-01-22 14:05:47.104563499 +0000 UTC m=+95.806463682"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.111255 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/10f3ac2f-9289-487c-b062-6d36a4ab0508-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.111319 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/10f3ac2f-9289-487c-b062-6d36a4ab0508-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.111353 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10f3ac2f-9289-487c-b062-6d36a4ab0508-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.111388 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10f3ac2f-9289-487c-b062-6d36a4ab0508-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.111411 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f3ac2f-9289-487c-b062-6d36a4ab0508-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.111464 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/10f3ac2f-9289-487c-b062-6d36a4ab0508-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.111511 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/10f3ac2f-9289-487c-b062-6d36a4ab0508-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.112172 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10f3ac2f-9289-487c-b062-6d36a4ab0508-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.117204 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f3ac2f-9289-487c-b062-6d36a4ab0508-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.128535 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10f3ac2f-9289-487c-b062-6d36a4ab0508-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sm5tj\" (UID: \"10f3ac2f-9289-487c-b062-6d36a4ab0508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.151960 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 08:27:35.143072337 +0000 UTC
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.152070 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.161667 4801 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.423617 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.571267 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.571345 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.571285 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5"
Jan 22 14:05:47 crc kubenswrapper[4801]: E0122 14:05:47.571405 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 22 14:05:47 crc kubenswrapper[4801]: I0122 14:05:47.571468 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 22 14:05:47 crc kubenswrapper[4801]: E0122 14:05:47.571582 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 22 14:05:47 crc kubenswrapper[4801]: E0122 14:05:47.571657 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa"
Jan 22 14:05:47 crc kubenswrapper[4801]: E0122 14:05:47.571770 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:48 crc kubenswrapper[4801]: I0122 14:05:48.299033 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj" event={"ID":"10f3ac2f-9289-487c-b062-6d36a4ab0508","Type":"ContainerStarted","Data":"2ca7efc199d79ab9b12a2cc8cc4305498a78b3fbc759e8eca691c35e14361ee4"} Jan 22 14:05:48 crc kubenswrapper[4801]: I0122 14:05:48.299090 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj" event={"ID":"10f3ac2f-9289-487c-b062-6d36a4ab0508","Type":"ContainerStarted","Data":"106eed98216c9864439022e8fe1d19078b9004bb850809ddf563d571a867a70b"} Jan 22 14:05:48 crc kubenswrapper[4801]: I0122 14:05:48.312108 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sm5tj" podStartSLOduration=77.312088875 podStartE2EDuration="1m17.312088875s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:05:48.31193333 +0000 UTC m=+97.013833543" watchObservedRunningTime="2026-01-22 14:05:48.312088875 +0000 UTC m=+97.013989058" Jan 22 14:05:49 crc kubenswrapper[4801]: I0122 14:05:49.571176 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:49 crc kubenswrapper[4801]: I0122 14:05:49.571216 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:49 crc kubenswrapper[4801]: I0122 14:05:49.571288 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:49 crc kubenswrapper[4801]: I0122 14:05:49.571176 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:49 crc kubenswrapper[4801]: E0122 14:05:49.571342 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:49 crc kubenswrapper[4801]: E0122 14:05:49.571510 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:49 crc kubenswrapper[4801]: E0122 14:05:49.571601 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:49 crc kubenswrapper[4801]: E0122 14:05:49.571659 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:49 crc kubenswrapper[4801]: I0122 14:05:49.732488 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:49 crc kubenswrapper[4801]: E0122 14:05:49.732689 4801 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:05:49 crc kubenswrapper[4801]: E0122 14:05:49.732786 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs podName:18bcc554-da90-40e9-b32f-e0d5d0936faa nodeName:}" failed. No retries permitted until 2026-01-22 14:06:53.732766299 +0000 UTC m=+162.434666502 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs") pod "network-metrics-daemon-ph2s5" (UID: "18bcc554-da90-40e9-b32f-e0d5d0936faa") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 14:05:51 crc kubenswrapper[4801]: I0122 14:05:51.570891 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:51 crc kubenswrapper[4801]: I0122 14:05:51.570974 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:51 crc kubenswrapper[4801]: E0122 14:05:51.573360 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:51 crc kubenswrapper[4801]: I0122 14:05:51.573434 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:51 crc kubenswrapper[4801]: I0122 14:05:51.573524 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:51 crc kubenswrapper[4801]: E0122 14:05:51.573723 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:51 crc kubenswrapper[4801]: E0122 14:05:51.574138 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:51 crc kubenswrapper[4801]: E0122 14:05:51.574240 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:52 crc kubenswrapper[4801]: I0122 14:05:52.571922 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:05:52 crc kubenswrapper[4801]: E0122 14:05:52.572123 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nx7sl_openshift-ovn-kubernetes(33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" Jan 22 14:05:53 crc kubenswrapper[4801]: I0122 14:05:53.570365 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:53 crc kubenswrapper[4801]: I0122 14:05:53.570561 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:53 crc kubenswrapper[4801]: I0122 14:05:53.570711 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:53 crc kubenswrapper[4801]: E0122 14:05:53.570720 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:53 crc kubenswrapper[4801]: I0122 14:05:53.570744 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:53 crc kubenswrapper[4801]: E0122 14:05:53.570886 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:53 crc kubenswrapper[4801]: E0122 14:05:53.571025 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:53 crc kubenswrapper[4801]: E0122 14:05:53.571100 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:55 crc kubenswrapper[4801]: I0122 14:05:55.571625 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:55 crc kubenswrapper[4801]: I0122 14:05:55.571674 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:55 crc kubenswrapper[4801]: I0122 14:05:55.571634 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:55 crc kubenswrapper[4801]: E0122 14:05:55.571795 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:55 crc kubenswrapper[4801]: E0122 14:05:55.571941 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:55 crc kubenswrapper[4801]: E0122 14:05:55.572009 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:55 crc kubenswrapper[4801]: I0122 14:05:55.572272 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:55 crc kubenswrapper[4801]: E0122 14:05:55.572359 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:57 crc kubenswrapper[4801]: I0122 14:05:57.570821 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:57 crc kubenswrapper[4801]: I0122 14:05:57.570860 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:57 crc kubenswrapper[4801]: E0122 14:05:57.570964 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:57 crc kubenswrapper[4801]: I0122 14:05:57.570837 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:57 crc kubenswrapper[4801]: E0122 14:05:57.571074 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:57 crc kubenswrapper[4801]: I0122 14:05:57.571152 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:57 crc kubenswrapper[4801]: E0122 14:05:57.571188 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:57 crc kubenswrapper[4801]: E0122 14:05:57.571403 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:05:59 crc kubenswrapper[4801]: I0122 14:05:59.570885 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:05:59 crc kubenswrapper[4801]: E0122 14:05:59.571016 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:05:59 crc kubenswrapper[4801]: I0122 14:05:59.571087 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:05:59 crc kubenswrapper[4801]: I0122 14:05:59.571112 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:05:59 crc kubenswrapper[4801]: I0122 14:05:59.571108 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:05:59 crc kubenswrapper[4801]: E0122 14:05:59.571503 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:05:59 crc kubenswrapper[4801]: E0122 14:05:59.571597 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:05:59 crc kubenswrapper[4801]: E0122 14:05:59.571713 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:01 crc kubenswrapper[4801]: I0122 14:06:01.570731 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:01 crc kubenswrapper[4801]: I0122 14:06:01.570784 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:01 crc kubenswrapper[4801]: I0122 14:06:01.570849 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:01 crc kubenswrapper[4801]: I0122 14:06:01.570917 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:01 crc kubenswrapper[4801]: E0122 14:06:01.573250 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:01 crc kubenswrapper[4801]: E0122 14:06:01.573394 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:01 crc kubenswrapper[4801]: E0122 14:06:01.573621 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:01 crc kubenswrapper[4801]: E0122 14:06:01.573755 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:03 crc kubenswrapper[4801]: I0122 14:06:03.571190 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:03 crc kubenswrapper[4801]: I0122 14:06:03.571247 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:03 crc kubenswrapper[4801]: I0122 14:06:03.571189 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:03 crc kubenswrapper[4801]: I0122 14:06:03.571366 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:03 crc kubenswrapper[4801]: E0122 14:06:03.571571 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:03 crc kubenswrapper[4801]: E0122 14:06:03.571748 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:03 crc kubenswrapper[4801]: E0122 14:06:03.571936 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:03 crc kubenswrapper[4801]: E0122 14:06:03.572052 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:04 crc kubenswrapper[4801]: I0122 14:06:04.356559 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/1.log" Jan 22 14:06:04 crc kubenswrapper[4801]: I0122 14:06:04.357102 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/0.log" Jan 22 14:06:04 crc kubenswrapper[4801]: I0122 14:06:04.357172 4801 generic.go:334] "Generic (PLEG): container finished" podID="82373306-6578-4229-851f-1d80cdabf2d7" containerID="bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4" exitCode=1 Jan 22 14:06:04 crc kubenswrapper[4801]: I0122 14:06:04.357213 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerDied","Data":"bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4"} Jan 22 14:06:04 crc kubenswrapper[4801]: I0122 14:06:04.357258 4801 scope.go:117] "RemoveContainer" containerID="da7bc7769b00937c594953b90f4fc9cb2b1f6f40a5871157b4d1e09ffef18074" Jan 22 14:06:04 crc kubenswrapper[4801]: I0122 14:06:04.357624 4801 scope.go:117] "RemoveContainer" containerID="bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4" Jan 22 14:06:04 crc kubenswrapper[4801]: E0122 14:06:04.357773 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-b2p9x_openshift-multus(82373306-6578-4229-851f-1d80cdabf2d7)\"" pod="openshift-multus/multus-b2p9x" podUID="82373306-6578-4229-851f-1d80cdabf2d7" Jan 22 14:06:05 crc kubenswrapper[4801]: I0122 14:06:05.361426 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/1.log" Jan 22 14:06:05 crc kubenswrapper[4801]: I0122 14:06:05.570853 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:05 crc kubenswrapper[4801]: I0122 14:06:05.570930 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:05 crc kubenswrapper[4801]: E0122 14:06:05.571297 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:05 crc kubenswrapper[4801]: I0122 14:06:05.571322 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:05 crc kubenswrapper[4801]: I0122 14:06:05.571395 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:05 crc kubenswrapper[4801]: E0122 14:06:05.571996 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:05 crc kubenswrapper[4801]: I0122 14:06:05.572584 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:06:05 crc kubenswrapper[4801]: E0122 14:06:05.572609 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:05 crc kubenswrapper[4801]: E0122 14:06:05.572740 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:06 crc kubenswrapper[4801]: I0122 14:06:06.366669 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/3.log" Jan 22 14:06:06 crc kubenswrapper[4801]: I0122 14:06:06.368982 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerStarted","Data":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} Jan 22 14:06:06 crc kubenswrapper[4801]: I0122 14:06:06.369963 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:06:06 crc kubenswrapper[4801]: I0122 14:06:06.394618 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podStartSLOduration=95.394601646 podStartE2EDuration="1m35.394601646s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:06.393504285 +0000 UTC m=+115.095404468" watchObservedRunningTime="2026-01-22 14:06:06.394601646 +0000 UTC m=+115.096501829" Jan 22 14:06:06 crc kubenswrapper[4801]: I0122 14:06:06.622922 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ph2s5"] Jan 22 14:06:06 crc kubenswrapper[4801]: I0122 14:06:06.623024 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:06 crc kubenswrapper[4801]: E0122 14:06:06.623114 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:07 crc kubenswrapper[4801]: I0122 14:06:07.570882 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:07 crc kubenswrapper[4801]: I0122 14:06:07.570972 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:07 crc kubenswrapper[4801]: E0122 14:06:07.571029 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:07 crc kubenswrapper[4801]: E0122 14:06:07.571173 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:07 crc kubenswrapper[4801]: I0122 14:06:07.571301 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:07 crc kubenswrapper[4801]: E0122 14:06:07.571482 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:08 crc kubenswrapper[4801]: I0122 14:06:08.570837 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:08 crc kubenswrapper[4801]: E0122 14:06:08.570960 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:09 crc kubenswrapper[4801]: I0122 14:06:09.570720 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:09 crc kubenswrapper[4801]: I0122 14:06:09.570742 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:09 crc kubenswrapper[4801]: E0122 14:06:09.570931 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:09 crc kubenswrapper[4801]: E0122 14:06:09.571049 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:09 crc kubenswrapper[4801]: I0122 14:06:09.571529 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:09 crc kubenswrapper[4801]: E0122 14:06:09.571680 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:10 crc kubenswrapper[4801]: I0122 14:06:10.571175 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:10 crc kubenswrapper[4801]: E0122 14:06:10.571422 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:11 crc kubenswrapper[4801]: E0122 14:06:11.547873 4801 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 22 14:06:11 crc kubenswrapper[4801]: I0122 14:06:11.571058 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:11 crc kubenswrapper[4801]: I0122 14:06:11.571146 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:11 crc kubenswrapper[4801]: I0122 14:06:11.571235 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:11 crc kubenswrapper[4801]: E0122 14:06:11.573319 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:11 crc kubenswrapper[4801]: E0122 14:06:11.573687 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:11 crc kubenswrapper[4801]: E0122 14:06:11.573802 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:11 crc kubenswrapper[4801]: E0122 14:06:11.653102 4801 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 22 14:06:12 crc kubenswrapper[4801]: I0122 14:06:12.571033 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:12 crc kubenswrapper[4801]: E0122 14:06:12.571724 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:13 crc kubenswrapper[4801]: I0122 14:06:13.570950 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:13 crc kubenswrapper[4801]: I0122 14:06:13.571096 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:13 crc kubenswrapper[4801]: E0122 14:06:13.571266 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:13 crc kubenswrapper[4801]: I0122 14:06:13.571290 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:13 crc kubenswrapper[4801]: E0122 14:06:13.571519 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:13 crc kubenswrapper[4801]: E0122 14:06:13.571655 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:14 crc kubenswrapper[4801]: I0122 14:06:14.571223 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:14 crc kubenswrapper[4801]: E0122 14:06:14.571359 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:15 crc kubenswrapper[4801]: I0122 14:06:15.570735 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:15 crc kubenswrapper[4801]: E0122 14:06:15.571112 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:15 crc kubenswrapper[4801]: I0122 14:06:15.570841 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:15 crc kubenswrapper[4801]: I0122 14:06:15.570762 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:15 crc kubenswrapper[4801]: E0122 14:06:15.571176 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:15 crc kubenswrapper[4801]: E0122 14:06:15.571421 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:16 crc kubenswrapper[4801]: I0122 14:06:16.570861 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:16 crc kubenswrapper[4801]: E0122 14:06:16.571005 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:16 crc kubenswrapper[4801]: I0122 14:06:16.571436 4801 scope.go:117] "RemoveContainer" containerID="bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4" Jan 22 14:06:16 crc kubenswrapper[4801]: E0122 14:06:16.654471 4801 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 22 14:06:17 crc kubenswrapper[4801]: I0122 14:06:17.411253 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/1.log" Jan 22 14:06:17 crc kubenswrapper[4801]: I0122 14:06:17.411356 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerStarted","Data":"c3af53c510e8852d6f71e64a3ff43342641d6b251fd5f757632ac020558b170b"} Jan 22 14:06:17 crc kubenswrapper[4801]: I0122 14:06:17.570662 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:17 crc kubenswrapper[4801]: I0122 14:06:17.570702 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:17 crc kubenswrapper[4801]: E0122 14:06:17.570786 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:17 crc kubenswrapper[4801]: I0122 14:06:17.570808 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:17 crc kubenswrapper[4801]: E0122 14:06:17.571072 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:17 crc kubenswrapper[4801]: E0122 14:06:17.571015 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:18 crc kubenswrapper[4801]: I0122 14:06:18.571107 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:18 crc kubenswrapper[4801]: E0122 14:06:18.571425 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:19 crc kubenswrapper[4801]: I0122 14:06:19.570856 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:19 crc kubenswrapper[4801]: E0122 14:06:19.571001 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:19 crc kubenswrapper[4801]: I0122 14:06:19.570882 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:19 crc kubenswrapper[4801]: I0122 14:06:19.571092 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:19 crc kubenswrapper[4801]: E0122 14:06:19.571174 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:19 crc kubenswrapper[4801]: E0122 14:06:19.571227 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:20 crc kubenswrapper[4801]: I0122 14:06:20.570611 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:20 crc kubenswrapper[4801]: E0122 14:06:20.570950 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ph2s5" podUID="18bcc554-da90-40e9-b32f-e0d5d0936faa" Jan 22 14:06:21 crc kubenswrapper[4801]: I0122 14:06:21.571251 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:21 crc kubenswrapper[4801]: E0122 14:06:21.573985 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 14:06:21 crc kubenswrapper[4801]: I0122 14:06:21.574060 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:21 crc kubenswrapper[4801]: I0122 14:06:21.574090 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:21 crc kubenswrapper[4801]: E0122 14:06:21.574177 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 14:06:21 crc kubenswrapper[4801]: E0122 14:06:21.574288 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 14:06:22 crc kubenswrapper[4801]: I0122 14:06:22.570842 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:22 crc kubenswrapper[4801]: I0122 14:06:22.573207 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 22 14:06:22 crc kubenswrapper[4801]: I0122 14:06:22.575412 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 22 14:06:23 crc kubenswrapper[4801]: I0122 14:06:23.570984 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:23 crc kubenswrapper[4801]: I0122 14:06:23.571014 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:23 crc kubenswrapper[4801]: I0122 14:06:23.571175 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:23 crc kubenswrapper[4801]: I0122 14:06:23.573670 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 22 14:06:23 crc kubenswrapper[4801]: I0122 14:06:23.573795 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 22 14:06:23 crc kubenswrapper[4801]: I0122 14:06:23.573798 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 22 14:06:23 crc kubenswrapper[4801]: I0122 14:06:23.574182 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 22 14:06:25 crc kubenswrapper[4801]: I0122 14:06:25.460083 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.143834 4801 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.171231 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zxs7c"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.171649 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.173012 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-g5mt6"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.173463 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.174990 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcjmg"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.175510 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.175680 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.176419 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.184799 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.184874 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.190593 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.190616 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 22 14:06:28 crc kubenswrapper[4801]: W0122 14:06:28.190756 4801 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Jan 22 14:06:28 crc kubenswrapper[4801]: E0122 14:06:28.190803 4801 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.190872 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 22 14:06:28 crc kubenswrapper[4801]: W0122 14:06:28.190883 4801 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.190899 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: E0122 14:06:28.190917 4801 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 22 14:06:28 crc kubenswrapper[4801]: W0122 14:06:28.190969 4801 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Jan 22 14:06:28 crc kubenswrapper[4801]: E0122 14:06:28.190980 4801 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191044 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191130 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191171 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191257 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191264 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191269 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191277 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191340 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 22 14:06:28 crc kubenswrapper[4801]: W0122 14:06:28.191386 4801 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Jan 22 14:06:28 crc kubenswrapper[4801]: E0122 14:06:28.191408 4801 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191760 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 22 14:06:28 crc kubenswrapper[4801]: W0122 14:06:28.191826 4801 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Jan 22 14:06:28 crc kubenswrapper[4801]: E0122 14:06:28.191945 4801 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191965 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.191994 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192008 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192016 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192072 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192095 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192228 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"]
Jan 22 14:06:28 crc kubenswrapper[4801]: W0122 14:06:28.192341 4801 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192426 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192498 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192579 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 22 14:06:28 crc kubenswrapper[4801]: E0122 14:06:28.192688 4801 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192711 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192827 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.192903 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.195014 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.195548 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.196021 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.196290 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.196985 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.197733 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.205164 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-sx6df"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.205939 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-sx6df"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216135 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216520 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216607 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216656 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216671 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216720 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216622 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216848 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.216920 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.217045 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.217355 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.218289 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.218750 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.219512 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.219613 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.219729 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.222659 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.222834 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.223738 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.224297 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.226668 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.228890 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.229516 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.234774 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.234835 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.234774 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.234763 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.234939 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235038 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235069 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235101 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235075 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235162 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235171 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235206 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nbhfx"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235284 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235467 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.235682 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbhfx"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.236351 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.236431 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.236519 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.236785 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fpc52"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.236851 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.237306 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.237632 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-glbkg"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.238061 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.238743 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zxs7c"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.239951 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-g5mt6"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.240581 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6ddc9"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.240886 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.241805 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.242061 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.242407 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.242632 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.243007 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcjmg"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.243085 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.243407 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.243751 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-57pdz"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.244071 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-57pdz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264466 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-node-pullsecrets\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264504 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264528 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2cn5\" (UniqueName: \"kubernetes.io/projected/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-kube-api-access-p2cn5\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264547 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpm4\" (UniqueName: \"kubernetes.io/projected/44be3897-5e75-4a25-b586-79c33d07da2c-kube-api-access-bvpm4\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264593 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-audit-policies\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264733 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264764 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264795 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44be3897-5e75-4a25-b586-79c33d07da2c-trusted-ca\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264812 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264827 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264880 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264912 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264928 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llpt7\" (UniqueName: \"kubernetes.io/projected/83688763-e6ec-4879-b214-03d2e00da08e-kube-api-access-llpt7\") pod \"cluster-samples-operator-665b6dd947-glntr\" (UID: \"83688763-e6ec-4879-b214-03d2e00da08e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264945 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9z7\" (UniqueName: \"kubernetes.io/projected/7d2c98a4-f3a8-4200-8432-6f68459320ca-kube-api-access-pr9z7\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264960 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264978 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.264993 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265010 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsms4\" (UniqueName: \"kubernetes.io/projected/5f171ae3-24fe-42e0-b8dd-0e733fc33381-kube-api-access-dsms4\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265024 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-audit-dir\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265039 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265061 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265077 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265095 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrfb\" (UniqueName: \"kubernetes.io/projected/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-kube-api-access-fxrfb\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265118 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44be3897-5e75-4a25-b586-79c33d07da2c-serving-cert\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265135 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-encryption-config\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265149 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-etcd-client\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265165 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f171ae3-24fe-42e0-b8dd-0e733fc33381-serving-cert\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265186 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265200 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265218 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265241 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-audit-dir\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265257 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/83688763-e6ec-4879-b214-03d2e00da08e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-glntr\" (UID: \"83688763-e6ec-4879-b214-03d2e00da08e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265271 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-client-ca\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265287 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265302 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265318 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265332 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-policies\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265346 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265361 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44be3897-5e75-4a25-b586-79c33d07da2c-config\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6"
Jan 22 14:06:28 crc
kubenswrapper[4801]: I0122 14:06:28.265374 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-serving-cert\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265389 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-config\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265405 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-audit\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.265420 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-dir\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.268487 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.269660 4801 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.269931 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.276290 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.276508 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.276610 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.276669 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.276851 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.277262 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.282249 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.282482 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.282549 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 
14:06:28.282632 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.282739 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.283645 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.284028 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.284101 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.284478 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.284538 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.284496 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.286610 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.286650 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.286666 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.286804 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.286880 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.286961 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.287032 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.288538 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.289046 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rg8c5"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.289603 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.289848 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.291026 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.291325 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.291920 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ttggq"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.292213 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.292545 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.293694 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.294141 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.299109 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.300737 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xcfrl"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.307474 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.308084 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.308552 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.308652 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.308962 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.309212 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.309353 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sx6df"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.309374 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.300823 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.309751 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.309770 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.310053 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.310252 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.310274 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.310387 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.310810 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.313517 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.315292 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.315326 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.315828 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.317523 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zckqh"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.318027 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.338660 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.338729 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.342174 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sb84b"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.342270 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.342194 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.342780 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.343572 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nbhfx"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.344636 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.345375 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.347414 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.349034 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.349538 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.351208 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.351817 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.351845 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mg4vw"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.352834 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.354515 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.355133 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.355386 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.356865 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.357103 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-glbkg"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.358673 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fpc52"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.360428 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gwkcf"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.360984 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwkcf" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.361648 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6ddc9"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.363383 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.364609 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.365674 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366195 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366230 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd78m\" (UniqueName: \"kubernetes.io/projected/74adbb74-aba6-4b36-a89f-d8f09ad1b241-kube-api-access-xd78m\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366260 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44be3897-5e75-4a25-b586-79c33d07da2c-config\") pod 
\"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366286 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-serving-cert\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366309 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-config\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366861 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-client\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366900 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa6610e-0cfe-4a02-be2d-014a2ad84215-auth-proxy-config\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366940 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-audit\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366964 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145bd263-96be-4598-ad6c-3c9a73b92d26-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.366991 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-dir\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367020 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367044 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6f9\" (UniqueName: \"kubernetes.io/projected/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-kube-api-access-6s6f9\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367067 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5lrs\" (UniqueName: \"kubernetes.io/projected/aa063689-071e-47b5-85d1-2bfe8d3b5fec-kube-api-access-k5lrs\") pod \"downloads-7954f5f757-sx6df\" (UID: \"aa063689-071e-47b5-85d1-2bfe8d3b5fec\") " pod="openshift-console/downloads-7954f5f757-sx6df" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367091 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367153 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367172 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-node-pullsecrets\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367239 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-oauth-config\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367271 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-trusted-ca-bundle\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367317 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltdg5\" (UniqueName: \"kubernetes.io/projected/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-kube-api-access-ltdg5\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367339 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9790065c-0263-4186-b043-a531f264b880-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367387 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzld7\" (UniqueName: \"kubernetes.io/projected/6ba12591-7bc1-4df8-a9c0-d515689eefe5-kube-api-access-kzld7\") pod \"dns-operator-744455d44c-rg8c5\" (UID: \"6ba12591-7bc1-4df8-a9c0-d515689eefe5\") " pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 
14:06:28.367415 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cn5\" (UniqueName: \"kubernetes.io/projected/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-kube-api-access-p2cn5\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367497 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt945\" (UniqueName: \"kubernetes.io/projected/322bcacd-4b58-46cf-b37e-2ffda3f87b24-kube-api-access-nt945\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367524 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9790065c-0263-4186-b043-a531f264b880-config\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367567 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-config\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367591 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367616 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-ca\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367658 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-config\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367702 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpm4\" (UniqueName: \"kubernetes.io/projected/44be3897-5e75-4a25-b586-79c33d07da2c-kube-api-access-bvpm4\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367751 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-node-pullsecrets\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367753 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-audit-policies\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367827 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367845 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367860 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-config\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367868 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-service-ca\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367914 4801 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-audit\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367927 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpx77\" (UniqueName: \"kubernetes.io/projected/801b8dae-b259-4026-816a-794d9bcc81a4-kube-api-access-dpx77\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367975 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xczg\" (UniqueName: \"kubernetes.io/projected/9790065c-0263-4186-b043-a531f264b880-kube-api-access-8xczg\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367996 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zckqh"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.368035 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-dir\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.368079 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lvd5\" (UniqueName: 
\"kubernetes.io/projected/33406000-fb47-484f-a43f-80332a3d82b4-kube-api-access-9lvd5\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.368120 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/322bcacd-4b58-46cf-b37e-2ffda3f87b24-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.368204 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44be3897-5e75-4a25-b586-79c33d07da2c-trusted-ca\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.368247 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.368277 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-config\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.368351 4801 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.367338 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44be3897-5e75-4a25-b586-79c33d07da2c-config\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369091 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369060 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-default-certificate\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369155 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc 
kubenswrapper[4801]: I0122 14:06:28.369218 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369243 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl9zf\" (UniqueName: \"kubernetes.io/projected/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-kube-api-access-wl9zf\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369266 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33406000-fb47-484f-a43f-80332a3d82b4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369303 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ba12591-7bc1-4df8-a9c0-d515689eefe5-metrics-tls\") pod \"dns-operator-744455d44c-rg8c5\" (UID: \"6ba12591-7bc1-4df8-a9c0-d515689eefe5\") " pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369793 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ttggq"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369801 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-llpt7\" (UniqueName: \"kubernetes.io/projected/83688763-e6ec-4879-b214-03d2e00da08e-kube-api-access-llpt7\") pod \"cluster-samples-operator-665b6dd947-glntr\" (UID: \"83688763-e6ec-4879-b214-03d2e00da08e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369850 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-audit-policies\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369875 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9z7\" (UniqueName: \"kubernetes.io/projected/7d2c98a4-f3a8-4200-8432-6f68459320ca-kube-api-access-pr9z7\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.369981 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z2bj\" (UniqueName: \"kubernetes.io/projected/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-kube-api-access-8z2bj\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370026 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f1199-bd10-4d84-a488-ffbbe320209a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: 
\"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370067 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b94qp\" (UniqueName: \"kubernetes.io/projected/32fa36ec-07ad-4ebe-8e79-3692f621cd37-kube-api-access-b94qp\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370118 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff335e7-91e9-438a-94f2-005a67ddfc2b-config\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370220 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ff335e7-91e9-438a-94f2-005a67ddfc2b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370298 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370358 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-service-ca-bundle\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370410 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74adbb74-aba6-4b36-a89f-d8f09ad1b241-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370438 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145bd263-96be-4598-ad6c-3c9a73b92d26-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370477 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa6610e-0cfe-4a02-be2d-014a2ad84215-config\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370509 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-audit-dir\") pod \"apiserver-76f77b778f-vcjmg\" (UID: 
\"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370531 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370556 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370580 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsms4\" (UniqueName: \"kubernetes.io/projected/5f171ae3-24fe-42e0-b8dd-0e733fc33381-kube-api-access-dsms4\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370590 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-audit-dir\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370606 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d6sx\" 
(UniqueName: \"kubernetes.io/projected/d6346af0-261a-48ce-8a89-c7d0d287bdce-kube-api-access-5d6sx\") pod \"multus-admission-controller-857f4d67dd-ttggq\" (UID: \"d6346af0-261a-48ce-8a89-c7d0d287bdce\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370654 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370679 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370723 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/322bcacd-4b58-46cf-b37e-2ffda3f87b24-images\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370728 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44be3897-5e75-4a25-b586-79c33d07da2c-trusted-ca\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 
14:06:28.370782 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370810 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370835 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-oauth-serving-cert\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370881 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801b8dae-b259-4026-816a-794d9bcc81a4-serving-cert\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370900 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33406000-fb47-484f-a43f-80332a3d82b4-metrics-tls\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: 
\"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.370920 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15f1199-bd10-4d84-a488-ffbbe320209a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: \"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371133 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrfb\" (UniqueName: \"kubernetes.io/projected/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-kube-api-access-fxrfb\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371141 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371157 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-stats-auth\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371205 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/74adbb74-aba6-4b36-a89f-d8f09ad1b241-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371229 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-service-ca-bundle\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371253 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xcfrl"] Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371273 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d6346af0-261a-48ce-8a89-c7d0d287bdce-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ttggq\" (UID: \"d6346af0-261a-48ce-8a89-c7d0d287bdce\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371315 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrk6\" (UniqueName: \"kubernetes.io/projected/1fa6610e-0cfe-4a02-be2d-014a2ad84215-kube-api-access-vfrk6\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371370 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44be3897-5e75-4a25-b586-79c33d07da2c-serving-cert\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371396 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-encryption-config\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371495 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fa36ec-07ad-4ebe-8e79-3692f621cd37-serving-cert\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371585 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff335e7-91e9-438a-94f2-005a67ddfc2b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371640 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-etcd-client\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371667 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f171ae3-24fe-42e0-b8dd-0e733fc33381-serving-cert\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.371718 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372367 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372417 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-metrics-certs\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372475 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372488 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33406000-fb47-484f-a43f-80332a3d82b4-trusted-ca\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372517 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-serving-cert\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372544 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-serving-cert\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372566 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74adbb74-aba6-4b36-a89f-d8f09ad1b241-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372578 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372705 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d15f1199-bd10-4d84-a488-ffbbe320209a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: \"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372716 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372772 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372828 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322bcacd-4b58-46cf-b37e-2ffda3f87b24-config\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372853 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-serving-cert\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372877 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-audit-dir\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372899 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-client-ca\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.372921 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-service-ca\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373059 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-audit-dir\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373105 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/83688763-e6ec-4879-b214-03d2e00da08e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-glntr\" (UID: \"83688763-e6ec-4879-b214-03d2e00da08e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373146 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373272 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-client-ca\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373309 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-config\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373336 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373358 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373379 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-policies\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373403 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv94q\" (UniqueName: \"kubernetes.io/projected/145bd263-96be-4598-ad6c-3c9a73b92d26-kube-api-access-nv94q\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373426 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1fa6610e-0cfe-4a02-be2d-014a2ad84215-machine-approver-tls\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373477 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.373771 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.374098 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rg8c5"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.374389 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.374526 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-client-ca\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.374679 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.375488 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-policies\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.375574 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.375657 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.376279 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.376786 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.377550 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.378025 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/83688763-e6ec-4879-b214-03d2e00da08e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-glntr\" (UID: \"83688763-e6ec-4879-b214-03d2e00da08e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.378078 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.379230 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44be3897-5e75-4a25-b586-79c33d07da2c-serving-cert\") pod \"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.379649 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.379703 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.380279 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-encryption-config\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.381050 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f171ae3-24fe-42e0-b8dd-0e733fc33381-serving-cert\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.381088 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.382546 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.382596 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.384742 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-serving-cert\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.384780 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.384804 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gw2wt"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.386253 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.386872 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-879qk"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.389798 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gw2wt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.394551 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.398282 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.398534 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.400623 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.401227 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-etcd-client\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.402852 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gwkcf"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.404149 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mg4vw"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.405174 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gw2wt"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.406291 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.408145 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.409998 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.411126 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sb84b"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.412605 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.413852 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.416667 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-879qk"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.418766 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6q4sx"]
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.419544 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.441099 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.453771 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474124 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74adbb74-aba6-4b36-a89f-d8f09ad1b241-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474160 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145bd263-96be-4598-ad6c-3c9a73b92d26-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474188 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa6610e-0cfe-4a02-be2d-014a2ad84215-config\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474230 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801b8dae-b259-4026-816a-794d9bcc81a4-serving-cert\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474257 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28df97df-6dc9-4670-b14d-eae33c4de87d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474283 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htrdz\" (UniqueName: \"kubernetes.io/projected/5c251ddf-82e0-4cda-8303-ec6e65474e45-kube-api-access-htrdz\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474328 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/74adbb74-aba6-4b36-a89f-d8f09ad1b241-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474351 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prpr\" (UniqueName: \"kubernetes.io/projected/a59a6854-54d3-4b6a-a037-eb212513fa8f-kube-api-access-7prpr\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474356 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.474375 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d6346af0-261a-48ce-8a89-c7d0d287bdce-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ttggq\" (UID: \"d6346af0-261a-48ce-8a89-c7d0d287bdce\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475078 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-key\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475114 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrk6\" (UniqueName: \"kubernetes.io/projected/1fa6610e-0cfe-4a02-be2d-014a2ad84215-kube-api-access-vfrk6\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475135 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff335e7-91e9-438a-94f2-005a67ddfc2b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475170 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-metrics-certs\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475171 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa6610e-0cfe-4a02-be2d-014a2ad84215-config\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475188 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbls\" (UniqueName: \"kubernetes.io/projected/b156efe9-7dba-4006-bbc0-49cd04c42d32-kube-api-access-nvbls\") pod \"ingress-canary-gwkcf\" (UID: \"b156efe9-7dba-4006-bbc0-49cd04c42d32\") " pod="openshift-ingress-canary/ingress-canary-gwkcf"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475211 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74adbb74-aba6-4b36-a89f-d8f09ad1b241-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475227 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475352 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkct2\" (UniqueName: \"kubernetes.io/projected/15a0fb0c-e901-4655-aa08-e52e944cea7c-kube-api-access-nkct2\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475394 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-serving-cert\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475414 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15a0fb0c-e901-4655-aa08-e52e944cea7c-metrics-tls\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475438 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322bcacd-4b58-46cf-b37e-2ffda3f87b24-config\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475479 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-service-ca\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475507 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwnb\" (UniqueName: \"kubernetes.io/projected/401b3157-b40f-489c-827f-d4f941e96001-kube-api-access-4zwnb\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475530 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1fa6610e-0cfe-4a02-be2d-014a2ad84215-machine-approver-tls\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475548 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd78m\" (UniqueName: \"kubernetes.io/projected/74adbb74-aba6-4b36-a89f-d8f09ad1b241-kube-api-access-xd78m\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475568 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-srv-cert\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475587 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145bd263-96be-4598-ad6c-3c9a73b92d26-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475606 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa6610e-0cfe-4a02-be2d-014a2ad84215-auth-proxy-config\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475625 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6f9\" (UniqueName: \"kubernetes.io/projected/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-kube-api-access-6s6f9\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475657 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b156efe9-7dba-4006-bbc0-49cd04c42d32-cert\") pod \"ingress-canary-gwkcf\" (UID: \"b156efe9-7dba-4006-bbc0-49cd04c42d32\") " pod="openshift-ingress-canary/ingress-canary-gwkcf"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475683 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5lrs\" (UniqueName: \"kubernetes.io/projected/aa063689-071e-47b5-85d1-2bfe8d3b5fec-kube-api-access-k5lrs\") pod \"downloads-7954f5f757-sx6df\" (UID: \"aa063689-071e-47b5-85d1-2bfe8d3b5fec\") " pod="openshift-console/downloads-7954f5f757-sx6df"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475712 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9790065c-0263-4186-b043-a531f264b880-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475771 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9790065c-0263-4186-b043-a531f264b880-config\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"
Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475788 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475822 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-ca\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475840 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzld7\" (UniqueName: \"kubernetes.io/projected/6ba12591-7bc1-4df8-a9c0-d515689eefe5-kube-api-access-kzld7\") pod \"dns-operator-744455d44c-rg8c5\" (UID: \"6ba12591-7bc1-4df8-a9c0-d515689eefe5\") " pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475857 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28df97df-6dc9-4670-b14d-eae33c4de87d-config\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475889 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xczg\" (UniqueName: \"kubernetes.io/projected/9790065c-0263-4186-b043-a531f264b880-kube-api-access-8xczg\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475908 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9lvd5\" (UniqueName: \"kubernetes.io/projected/33406000-fb47-484f-a43f-80332a3d82b4-kube-api-access-9lvd5\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475928 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8408fe11-4e94-4069-89bf-9370719c1770-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-z5xtp\" (UID: \"8408fe11-4e94-4069-89bf-9370719c1770\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475960 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl9zf\" (UniqueName: \"kubernetes.io/projected/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-kube-api-access-wl9zf\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475978 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ba12591-7bc1-4df8-a9c0-d515689eefe5-metrics-tls\") pod \"dns-operator-744455d44c-rg8c5\" (UID: \"6ba12591-7bc1-4df8-a9c0-d515689eefe5\") " pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.475997 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: 
\"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476022 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2bj\" (UniqueName: \"kubernetes.io/projected/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-kube-api-access-8z2bj\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476040 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f1199-bd10-4d84-a488-ffbbe320209a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: \"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476058 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-service-ca-bundle\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476074 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-profile-collector-cert\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476106 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74adbb74-aba6-4b36-a89f-d8f09ad1b241-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476107 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d6sx\" (UniqueName: \"kubernetes.io/projected/d6346af0-261a-48ce-8a89-c7d0d287bdce-kube-api-access-5d6sx\") pod \"multus-admission-controller-857f4d67dd-ttggq\" (UID: \"d6346af0-261a-48ce-8a89-c7d0d287bdce\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476143 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa6610e-0cfe-4a02-be2d-014a2ad84215-auth-proxy-config\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476158 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476182 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33406000-fb47-484f-a43f-80332a3d82b4-metrics-tls\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476202 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15f1199-bd10-4d84-a488-ffbbe320209a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: \"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476240 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-oauth-serving-cert\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476243 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322bcacd-4b58-46cf-b37e-2ffda3f87b24-config\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476263 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/322bcacd-4b58-46cf-b37e-2ffda3f87b24-images\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476281 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-stats-auth\") pod \"router-default-5444994796-57pdz\" 
(UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476298 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-service-ca-bundle\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476317 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28df97df-6dc9-4670-b14d-eae33c4de87d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476336 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a59a6854-54d3-4b6a-a037-eb212513fa8f-secret-volume\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.476538 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fa36ec-07ad-4ebe-8e79-3692f621cd37-serving-cert\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.477001 4801 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15f1199-bd10-4d84-a488-ffbbe320209a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: \"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.477204 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9790065c-0263-4186-b043-a531f264b880-config\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.477316 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-service-ca-bundle\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.477531 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-service-ca-bundle\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.477545 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/322bcacd-4b58-46cf-b37e-2ffda3f87b24-images\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: 
I0122 14:06:28.477762 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/74adbb74-aba6-4b36-a89f-d8f09ad1b241-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.478056 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.478198 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-oauth-serving-cert\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.478631 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33406000-fb47-484f-a43f-80332a3d82b4-trusted-ca\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.478651 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801b8dae-b259-4026-816a-794d9bcc81a4-serving-cert\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.478660 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-serving-cert\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.478708 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-serving-cert\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.478729 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d15f1199-bd10-4d84-a488-ffbbe320209a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: \"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479013 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-metrics-certs\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479112 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-client-ca\") 
pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479411 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-config\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479626 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv94q\" (UniqueName: \"kubernetes.io/projected/145bd263-96be-4598-ad6c-3c9a73b92d26-kube-api-access-nv94q\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479670 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-client\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479704 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-cabundle\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479758 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15a0fb0c-e901-4655-aa08-e52e944cea7c-config-volume\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479784 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479808 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a59a6854-54d3-4b6a-a037-eb212513fa8f-config-volume\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479834 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-oauth-config\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479858 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-trusted-ca-bundle\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479885 4801 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltdg5\" (UniqueName: \"kubernetes.io/projected/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-kube-api-access-ltdg5\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479918 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt945\" (UniqueName: \"kubernetes.io/projected/322bcacd-4b58-46cf-b37e-2ffda3f87b24-kube-api-access-nt945\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479944 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-config\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479967 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-config\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.479989 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpx77\" (UniqueName: \"kubernetes.io/projected/801b8dae-b259-4026-816a-794d9bcc81a4-kube-api-access-dpx77\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480010 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-client-ca\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480014 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ctpt\" (UniqueName: \"kubernetes.io/projected/8408fe11-4e94-4069-89bf-9370719c1770-kube-api-access-6ctpt\") pod \"package-server-manager-789f6589d5-z5xtp\" (UID: \"8408fe11-4e94-4069-89bf-9370719c1770\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480060 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-stats-auth\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480081 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1fa6610e-0cfe-4a02-be2d-014a2ad84215-machine-approver-tls\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480067 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-service-ca\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480150 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/322bcacd-4b58-46cf-b37e-2ffda3f87b24-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480182 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-config\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480261 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-default-certificate\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480311 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33406000-fb47-484f-a43f-80332a3d82b4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480342 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-b94qp\" (UniqueName: \"kubernetes.io/projected/32fa36ec-07ad-4ebe-8e79-3692f621cd37-kube-api-access-b94qp\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480368 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff335e7-91e9-438a-94f2-005a67ddfc2b-config\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480411 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ff335e7-91e9-438a-94f2-005a67ddfc2b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480435 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt85s\" (UniqueName: \"kubernetes.io/projected/57c1fbaf-6477-491e-8a43-9f780293b8ae-kube-api-access-jt85s\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480820 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9790065c-0263-4186-b043-a531f264b880-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480849 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-service-ca\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.480339 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-config\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.481236 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.481638 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801b8dae-b259-4026-816a-794d9bcc81a4-config\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.481851 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15f1199-bd10-4d84-a488-ffbbe320209a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: 
\"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.481861 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-config\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.481952 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-serving-cert\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.482352 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-trusted-ca-bundle\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.482737 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-serving-cert\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.483163 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.483283 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fa36ec-07ad-4ebe-8e79-3692f621cd37-serving-cert\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.483701 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-oauth-config\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.484622 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-default-certificate\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.484753 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/322bcacd-4b58-46cf-b37e-2ffda3f87b24-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.493770 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 22 14:06:28 crc 
kubenswrapper[4801]: I0122 14:06:28.513802 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.533924 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.540096 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ba12591-7bc1-4df8-a9c0-d515689eefe5-metrics-tls\") pod \"dns-operator-744455d44c-rg8c5\" (UID: \"6ba12591-7bc1-4df8-a9c0-d515689eefe5\") " pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.553965 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.574435 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581066 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkct2\" (UniqueName: \"kubernetes.io/projected/15a0fb0c-e901-4655-aa08-e52e944cea7c-kube-api-access-nkct2\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581103 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581167 4801 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15a0fb0c-e901-4655-aa08-e52e944cea7c-metrics-tls\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581193 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwnb\" (UniqueName: \"kubernetes.io/projected/401b3157-b40f-489c-827f-d4f941e96001-kube-api-access-4zwnb\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581216 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-srv-cert\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581243 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b156efe9-7dba-4006-bbc0-49cd04c42d32-cert\") pod \"ingress-canary-gwkcf\" (UID: \"b156efe9-7dba-4006-bbc0-49cd04c42d32\") " pod="openshift-ingress-canary/ingress-canary-gwkcf" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581275 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28df97df-6dc9-4670-b14d-eae33c4de87d-config\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 
14:06:28.581356 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581380 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8408fe11-4e94-4069-89bf-9370719c1770-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-z5xtp\" (UID: \"8408fe11-4e94-4069-89bf-9370719c1770\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581422 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-profile-collector-cert\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581469 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28df97df-6dc9-4670-b14d-eae33c4de87d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581487 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a59a6854-54d3-4b6a-a037-eb212513fa8f-secret-volume\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581526 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-cabundle\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581547 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a59a6854-54d3-4b6a-a037-eb212513fa8f-config-volume\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581563 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15a0fb0c-e901-4655-aa08-e52e944cea7c-config-volume\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581604 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ctpt\" (UniqueName: \"kubernetes.io/projected/8408fe11-4e94-4069-89bf-9370719c1770-kube-api-access-6ctpt\") pod \"package-server-manager-789f6589d5-z5xtp\" (UID: \"8408fe11-4e94-4069-89bf-9370719c1770\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581722 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jt85s\" (UniqueName: \"kubernetes.io/projected/57c1fbaf-6477-491e-8a43-9f780293b8ae-kube-api-access-jt85s\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581778 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28df97df-6dc9-4670-b14d-eae33c4de87d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581800 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htrdz\" (UniqueName: \"kubernetes.io/projected/5c251ddf-82e0-4cda-8303-ec6e65474e45-kube-api-access-htrdz\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581817 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prpr\" (UniqueName: \"kubernetes.io/projected/a59a6854-54d3-4b6a-a037-eb212513fa8f-kube-api-access-7prpr\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581850 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-key\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.581885 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbls\" (UniqueName: \"kubernetes.io/projected/b156efe9-7dba-4006-bbc0-49cd04c42d32-kube-api-access-nvbls\") pod \"ingress-canary-gwkcf\" (UID: \"b156efe9-7dba-4006-bbc0-49cd04c42d32\") " pod="openshift-ingress-canary/ingress-canary-gwkcf" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.594286 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.598091 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145bd263-96be-4598-ad6c-3c9a73b92d26-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.614227 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.633891 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.637208 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145bd263-96be-4598-ad6c-3c9a73b92d26-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.653386 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.674240 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.693918 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.714042 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.722216 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff335e7-91e9-438a-94f2-005a67ddfc2b-config\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.733595 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.739784 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff335e7-91e9-438a-94f2-005a67ddfc2b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.754632 4801 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.768240 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d6346af0-261a-48ce-8a89-c7d0d287bdce-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ttggq\" (UID: \"d6346af0-261a-48ce-8a89-c7d0d287bdce\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.776698 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.795471 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.796232 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-service-ca\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.815363 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.823227 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-client\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.834228 4801 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.854127 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.873968 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.879912 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-serving-cert\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.894618 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.914408 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.935294 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.953644 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.962158 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-config\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 
14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.974400 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.985432 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28df97df-6dc9-4670-b14d-eae33c4de87d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.994801 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 22 14:06:28 crc kubenswrapper[4801]: I0122 14:06:28.997223 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-etcd-ca\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.014258 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.022528 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28df97df-6dc9-4670-b14d-eae33c4de87d-config\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.033848 4801 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.053492 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.074136 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.094633 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.114077 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.120613 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33406000-fb47-484f-a43f-80332a3d82b4-metrics-tls\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.144565 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.151465 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33406000-fb47-484f-a43f-80332a3d82b4-trusted-ca\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.154251 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 22 
14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.173525 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.194172 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.213223 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.233175 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.253515 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.274771 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.293943 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.312149 4801 request.go:700] Waited for 1.001508119s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.313755 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 22 14:06:29 crc kubenswrapper[4801]: 
I0122 14:06:29.325085 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-profile-collector-cert\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.325906 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a59a6854-54d3-4b6a-a037-eb212513fa8f-secret-volume\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.333048 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.354116 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.362628 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a59a6854-54d3-4b6a-a037-eb212513fa8f-config-volume\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.369384 4801 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.369427 4801 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap 
cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.369542 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:29.86950261 +0000 UTC m=+138.571402833 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.369583 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:29.869561461 +0000 UTC m=+138.571461694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.370534 4801 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.370721 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:29.870691254 +0000 UTC m=+138.572591517 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.371609 4801 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.371662 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:29.871648211 +0000 UTC m=+138.573548474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.371955 4801 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.372113 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:29.872092714 +0000 UTC m=+138.573992967 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.374568 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.394603 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.414569 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.433765 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.445224 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8408fe11-4e94-4069-89bf-9370719c1770-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-z5xtp\" (UID: \"8408fe11-4e94-4069-89bf-9370719c1770\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.453381 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.473380 4801 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.494543 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.513746 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.535118 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.553575 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.573733 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582200 4801 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582230 4801 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582309 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca podName:401b3157-b40f-489c-827f-d4f941e96001 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.082280629 +0000 UTC m=+138.784180812 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca") pod "marketplace-operator-79b997595-mg4vw" (UID: "401b3157-b40f-489c-827f-d4f941e96001") : failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582322 4801 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582334 4801 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582337 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-key podName:5c251ddf-82e0-4cda-8303-ec6e65474e45 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.08232632 +0000 UTC m=+138.784226713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-key") pod "service-ca-9c57cc56f-sb84b" (UID: "5c251ddf-82e0-4cda-8303-ec6e65474e45") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582564 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a0fb0c-e901-4655-aa08-e52e944cea7c-metrics-tls podName:15a0fb0c-e901-4655-aa08-e52e944cea7c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.082523556 +0000 UTC m=+138.784423769 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15a0fb0c-e901-4655-aa08-e52e944cea7c-metrics-tls") pod "dns-default-gw2wt" (UID: "15a0fb0c-e901-4655-aa08-e52e944cea7c") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582352 4801 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582588 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b156efe9-7dba-4006-bbc0-49cd04c42d32-cert podName:b156efe9-7dba-4006-bbc0-49cd04c42d32 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.082576367 +0000 UTC m=+138.784476580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b156efe9-7dba-4006-bbc0-49cd04c42d32-cert") pod "ingress-canary-gwkcf" (UID: "b156efe9-7dba-4006-bbc0-49cd04c42d32") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582674 4801 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582691 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-srv-cert podName:57c1fbaf-6477-491e-8a43-9f780293b8ae nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.08266953 +0000 UTC m=+138.784569883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-srv-cert") pod "catalog-operator-68c6474976-q2f25" (UID: "57c1fbaf-6477-491e-8a43-9f780293b8ae") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582692 4801 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582715 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15a0fb0c-e901-4655-aa08-e52e944cea7c-config-volume podName:15a0fb0c-e901-4655-aa08-e52e944cea7c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.082703701 +0000 UTC m=+138.784603914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/15a0fb0c-e901-4655-aa08-e52e944cea7c-config-volume") pod "dns-default-gw2wt" (UID: "15a0fb0c-e901-4655-aa08-e52e944cea7c") : failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582800 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics podName:401b3157-b40f-489c-827f-d4f941e96001 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.082789774 +0000 UTC m=+138.784689957 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics") pod "marketplace-operator-79b997595-mg4vw" (UID: "401b3157-b40f-489c-827f-d4f941e96001") : failed to sync secret cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.582226 4801 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: E0122 14:06:29.583586 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-cabundle podName:5c251ddf-82e0-4cda-8303-ec6e65474e45 nodeName:}" failed. No retries permitted until 2026-01-22 14:06:30.083543095 +0000 UTC m=+138.785443468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-cabundle") pod "service-ca-9c57cc56f-sb84b" (UID: "5c251ddf-82e0-4cda-8303-ec6e65474e45") : failed to sync configmap cache: timed out waiting for the condition Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.595038 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.613625 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.635178 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.655560 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 22 14:06:29 crc 
kubenswrapper[4801]: I0122 14:06:29.674272 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.693974 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.714584 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.733810 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.754050 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.774577 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.794616 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.813581 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.833441 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.854059 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.874260 4801 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.894231 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.906398 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.906742 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.906955 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.907048 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.907763 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.914039 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.940508 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.953996 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.974359 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 22 14:06:29 crc kubenswrapper[4801]: I0122 14:06:29.994614 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.013388 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.034491 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.054363 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.092394 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpm4\" (UniqueName: \"kubernetes.io/projected/44be3897-5e75-4a25-b586-79c33d07da2c-kube-api-access-bvpm4\") pod 
\"console-operator-58897d9998-g5mt6\" (UID: \"44be3897-5e75-4a25-b586-79c33d07da2c\") " pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.109995 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2cn5\" (UniqueName: \"kubernetes.io/projected/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-kube-api-access-p2cn5\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.110650 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-key\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.110726 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.110773 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15a0fb0c-e901-4655-aa08-e52e944cea7c-metrics-tls\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.110819 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-srv-cert\") pod 
\"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.110860 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b156efe9-7dba-4006-bbc0-49cd04c42d32-cert\") pod \"ingress-canary-gwkcf\" (UID: \"b156efe9-7dba-4006-bbc0-49cd04c42d32\") " pod="openshift-ingress-canary/ingress-canary-gwkcf" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.110998 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.111128 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-cabundle\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.111160 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15a0fb0c-e901-4655-aa08-e52e944cea7c-config-volume\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.113407 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-cabundle\") pod 
\"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.114548 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.114944 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.117167 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5c251ddf-82e0-4cda-8303-ec6e65474e45-signing-key\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.118659 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/57c1fbaf-6477-491e-8a43-9f780293b8ae-srv-cert\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.119243 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b156efe9-7dba-4006-bbc0-49cd04c42d32-cert\") pod \"ingress-canary-gwkcf\" (UID: \"b156efe9-7dba-4006-bbc0-49cd04c42d32\") " pod="openshift-ingress-canary/ingress-canary-gwkcf" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.130513 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llpt7\" (UniqueName: \"kubernetes.io/projected/83688763-e6ec-4879-b214-03d2e00da08e-kube-api-access-llpt7\") pod \"cluster-samples-operator-665b6dd947-glntr\" (UID: \"83688763-e6ec-4879-b214-03d2e00da08e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.152188 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9z7\" (UniqueName: \"kubernetes.io/projected/7d2c98a4-f3a8-4200-8432-6f68459320ca-kube-api-access-pr9z7\") pod \"oauth-openshift-558db77b4-zxs7c\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.172059 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsms4\" (UniqueName: \"kubernetes.io/projected/5f171ae3-24fe-42e0-b8dd-0e733fc33381-kube-api-access-dsms4\") pod \"route-controller-manager-6576b87f9c-m4plz\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.189886 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrfb\" (UniqueName: \"kubernetes.io/projected/c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea-kube-api-access-fxrfb\") pod \"apiserver-7bbb656c7d-jxp9w\" (UID: \"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.194212 4801 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.205340 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15a0fb0c-e901-4655-aa08-e52e944cea7c-metrics-tls\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.213930 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.222307 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15a0fb0c-e901-4655-aa08-e52e944cea7c-config-volume\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.235200 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.253649 4801 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.273711 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.293376 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.294056 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.312319 4801 request.go:700] Waited for 1.902292771s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.315625 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.341409 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.353996 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.372858 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.374291 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.391708 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.395075 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.442011 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74adbb74-aba6-4b36-a89f-d8f09ad1b241-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.460481 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrk6\" (UniqueName: \"kubernetes.io/projected/1fa6610e-0cfe-4a02-be2d-014a2ad84215-kube-api-access-vfrk6\") pod \"machine-approver-56656f9798-s84ts\" (UID: \"1fa6610e-0cfe-4a02-be2d-014a2ad84215\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.477931 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd78m\" (UniqueName: \"kubernetes.io/projected/74adbb74-aba6-4b36-a89f-d8f09ad1b241-kube-api-access-xd78m\") pod \"cluster-image-registry-operator-dc59b4c8b-zvlsl\" (UID: \"74adbb74-aba6-4b36-a89f-d8f09ad1b241\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.499147 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d6sx\" (UniqueName: \"kubernetes.io/projected/d6346af0-261a-48ce-8a89-c7d0d287bdce-kube-api-access-5d6sx\") pod \"multus-admission-controller-857f4d67dd-ttggq\" (UID: \"d6346af0-261a-48ce-8a89-c7d0d287bdce\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.504346 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zxs7c"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.515842 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl9zf\" (UniqueName: \"kubernetes.io/projected/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-kube-api-access-wl9zf\") pod \"console-f9d7485db-nbhfx\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " pod="openshift-console/console-f9d7485db-nbhfx"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.536159 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z2bj\" (UniqueName: \"kubernetes.io/projected/b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb-kube-api-access-8z2bj\") pod \"openshift-config-operator-7777fb866f-fpc52\" (UID: \"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.546160 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-g5mt6"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.548016 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.551993 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lvd5\" (UniqueName: \"kubernetes.io/projected/33406000-fb47-484f-a43f-80332a3d82b4-kube-api-access-9lvd5\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k"
Jan 22 14:06:30 crc kubenswrapper[4801]: W0122 14:06:30.555191 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44be3897_5e75_4a25_b586_79c33d07da2c.slice/crio-c6a97be975ff82d4983733e96e1378b375c46f4697adaac5774c111d6c4d543a WatchSource:0}: Error finding container c6a97be975ff82d4983733e96e1378b375c46f4697adaac5774c111d6c4d543a: Status 404 returned error can't find the container with id c6a97be975ff82d4983733e96e1378b375c46f4697adaac5774c111d6c4d543a
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.567193 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xczg\" (UniqueName: \"kubernetes.io/projected/9790065c-0263-4186-b043-a531f264b880-kube-api-access-8xczg\") pod \"openshift-apiserver-operator-796bbdcf4f-845tw\" (UID: \"9790065c-0263-4186-b043-a531f264b880\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.570899 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.579485 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.595516 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzld7\" (UniqueName: \"kubernetes.io/projected/6ba12591-7bc1-4df8-a9c0-d515689eefe5-kube-api-access-kzld7\") pod \"dns-operator-744455d44c-rg8c5\" (UID: \"6ba12591-7bc1-4df8-a9c0-d515689eefe5\") " pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.615191 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5lrs\" (UniqueName: \"kubernetes.io/projected/aa063689-071e-47b5-85d1-2bfe8d3b5fec-kube-api-access-k5lrs\") pod \"downloads-7954f5f757-sx6df\" (UID: \"aa063689-071e-47b5-85d1-2bfe8d3b5fec\") " pod="openshift-console/downloads-7954f5f757-sx6df"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.625939 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.631958 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6f9\" (UniqueName: \"kubernetes.io/projected/0aad65ae-dbd7-48ef-a84e-8f60be1848c4-kube-api-access-6s6f9\") pod \"etcd-operator-b45778765-xcfrl\" (UID: \"0aad65ae-dbd7-48ef-a84e-8f60be1848c4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.646078 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.655019 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d15f1199-bd10-4d84-a488-ffbbe320209a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mhnvv\" (UID: \"d15f1199-bd10-4d84-a488-ffbbe320209a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv"
Jan 22 14:06:30 crc kubenswrapper[4801]: W0122 14:06:30.656417 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f171ae3_24fe_42e0_b8dd_0e733fc33381.slice/crio-76a71ae39948c7835c8292e766aae647ee0edb8b681e9e3e5140e03fe8910381 WatchSource:0}: Error finding container 76a71ae39948c7835c8292e766aae647ee0edb8b681e9e3e5140e03fe8910381: Status 404 returned error can't find the container with id 76a71ae39948c7835c8292e766aae647ee0edb8b681e9e3e5140e03fe8910381
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.673504 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv94q\" (UniqueName: \"kubernetes.io/projected/145bd263-96be-4598-ad6c-3c9a73b92d26-kube-api-access-nv94q\") pod \"openshift-controller-manager-operator-756b6f6bc6-l8prd\" (UID: \"145bd263-96be-4598-ad6c-3c9a73b92d26\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.702062 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltdg5\" (UniqueName: \"kubernetes.io/projected/9f30eedd-eee4-43f7-a1dd-4614580c7f0a-kube-api-access-ltdg5\") pod \"router-default-5444994796-57pdz\" (UID: \"9f30eedd-eee4-43f7-a1dd-4614580c7f0a\") " pod="openshift-ingress/router-default-5444994796-57pdz"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.705603 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-sx6df"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.713435 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.718654 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt945\" (UniqueName: \"kubernetes.io/projected/322bcacd-4b58-46cf-b37e-2ffda3f87b24-kube-api-access-nt945\") pod \"machine-api-operator-5694c8668f-tzbl4\" (UID: \"322bcacd-4b58-46cf-b37e-2ffda3f87b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.729213 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpx77\" (UniqueName: \"kubernetes.io/projected/801b8dae-b259-4026-816a-794d9bcc81a4-kube-api-access-dpx77\") pod \"authentication-operator-69f744f599-glbkg\" (UID: \"801b8dae-b259-4026-816a-794d9bcc81a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.740890 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.750029 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.753745 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ff335e7-91e9-438a-94f2-005a67ddfc2b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dd4gg\" (UID: \"1ff335e7-91e9-438a-94f2-005a67ddfc2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.767711 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbhfx"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.769051 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33406000-fb47-484f-a43f-80332a3d82b4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-d9f9k\" (UID: \"33406000-fb47-484f-a43f-80332a3d82b4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.787088 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ttggq"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.790222 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b94qp\" (UniqueName: \"kubernetes.io/projected/32fa36ec-07ad-4ebe-8e79-3692f621cd37-kube-api-access-b94qp\") pod \"controller-manager-879f6c89f-6ddc9\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9"
Jan 22 14:06:30 crc kubenswrapper[4801]: W0122 14:06:30.801776 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6346af0_261a_48ce_8a89_c7d0d287bdce.slice/crio-b1973564d60c3984c7ad62bf02ac4ffbd61e211c4d69d0511100418875025714 WatchSource:0}: Error finding container b1973564d60c3984c7ad62bf02ac4ffbd61e211c4d69d0511100418875025714: Status 404 returned error can't find the container with id b1973564d60c3984c7ad62bf02ac4ffbd61e211c4d69d0511100418875025714
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.807536 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkct2\" (UniqueName: \"kubernetes.io/projected/15a0fb0c-e901-4655-aa08-e52e944cea7c-kube-api-access-nkct2\") pod \"dns-default-gw2wt\" (UID: \"15a0fb0c-e901-4655-aa08-e52e944cea7c\") " pod="openshift-dns/dns-default-gw2wt"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.811318 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.817776 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.825828 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.829995 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwnb\" (UniqueName: \"kubernetes.io/projected/401b3157-b40f-489c-827f-d4f941e96001-kube-api-access-4zwnb\") pod \"marketplace-operator-79b997595-mg4vw\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.832547 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.838553 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-57pdz"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.852011 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28df97df-6dc9-4670-b14d-eae33c4de87d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nsgnh\" (UID: \"28df97df-6dc9-4670-b14d-eae33c4de87d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.855013 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.863934 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.868623 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prpr\" (UniqueName: \"kubernetes.io/projected/a59a6854-54d3-4b6a-a037-eb212513fa8f-kube-api-access-7prpr\") pod \"collect-profiles-29484840-xzxsv\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.871368 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.892059 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ctpt\" (UniqueName: \"kubernetes.io/projected/8408fe11-4e94-4069-89bf-9370719c1770-kube-api-access-6ctpt\") pod \"package-server-manager-789f6589d5-z5xtp\" (UID: \"8408fe11-4e94-4069-89bf-9370719c1770\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp"
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.906719 4801 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.906839 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:31.906814268 +0000 UTC m=+140.608714441 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync configmap cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.907182 4801 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.907221 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:31.907208559 +0000 UTC m=+140.609108732 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync configmap cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.907222 4801 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.907273 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:31.907262841 +0000 UTC m=+140.609163024 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync secret cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.907299 4801 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.907330 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:31.907322963 +0000 UTC m=+140.609223146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync secret cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.908886 4801 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: E0122 14:06:30.908977 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config podName:eb7d48ad-03ae-4594-8c04-d9afa9dc453c nodeName:}" failed. No retries permitted until 2026-01-22 14:06:31.908953869 +0000 UTC m=+140.610854042 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config") pod "apiserver-76f77b778f-vcjmg" (UID: "eb7d48ad-03ae-4594-8c04-d9afa9dc453c") : failed to sync secret cache: timed out waiting for the condition
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.912844 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.916514 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htrdz\" (UniqueName: \"kubernetes.io/projected/5c251ddf-82e0-4cda-8303-ec6e65474e45-kube-api-access-htrdz\") pod \"service-ca-9c57cc56f-sb84b\" (UID: \"5c251ddf-82e0-4cda-8303-ec6e65474e45\") " pod="openshift-service-ca/service-ca-9c57cc56f-sb84b"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.923621 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sx6df"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.925577 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.929166 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt85s\" (UniqueName: \"kubernetes.io/projected/57c1fbaf-6477-491e-8a43-9f780293b8ae-kube-api-access-jt85s\") pod \"catalog-operator-68c6474976-q2f25\" (UID: \"57c1fbaf-6477-491e-8a43-9f780293b8ae\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.942287 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.944755 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.950411 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbls\" (UniqueName: \"kubernetes.io/projected/b156efe9-7dba-4006-bbc0-49cd04c42d32-kube-api-access-nvbls\") pod \"ingress-canary-gwkcf\" (UID: \"b156efe9-7dba-4006-bbc0-49cd04c42d32\") " pod="openshift-ingress-canary/ingress-canary-gwkcf"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.953868 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.954214 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw"]
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.959075 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.974280 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.976841 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp"
Jan 22 14:06:30 crc kubenswrapper[4801]: I0122 14:06:30.993436 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.003762 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sb84b"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.013373 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.034434 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.042273 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.055812 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwkcf"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.060262 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.060544 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.065937 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gw2wt"
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.481810 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" event={"ID":"1fa6610e-0cfe-4a02-be2d-014a2ad84215","Type":"ContainerStarted","Data":"bab05e8a172b695cdbedeba67d956837e4768624729b194c39733f4a54ed9dfe"}
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.483085 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" event={"ID":"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea","Type":"ContainerStarted","Data":"30bb7b4c0c5a57d59ea7d3793aead9196bc93a6dacb6d869cd4e5416ac69a115"}
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.483211 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" event={"ID":"44be3897-5e75-4a25-b586-79c33d07da2c","Type":"ContainerStarted","Data":"c6a97be975ff82d4983733e96e1378b375c46f4697adaac5774c111d6c4d543a"}
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.483899 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" event={"ID":"5f171ae3-24fe-42e0-b8dd-0e733fc33381","Type":"ContainerStarted","Data":"76a71ae39948c7835c8292e766aae647ee0edb8b681e9e3e5140e03fe8910381"}
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.484674 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" event={"ID":"d6346af0-261a-48ce-8a89-c7d0d287bdce","Type":"ContainerStarted","Data":"b1973564d60c3984c7ad62bf02ac4ffbd61e211c4d69d0511100418875025714"}
Jan 22 14:06:31 crc kubenswrapper[4801]: I0122 14:06:31.485430 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" event={"ID":"7d2c98a4-f3a8-4200-8432-6f68459320ca","Type":"ContainerStarted","Data":"8f0f8c6b3f492cd73e02ba02e0cc4d3015f05b99f6b236dee54fb361caa9f569"}
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.359784 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.360162 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25f78c50-1bd7-4abd-b231-a932aa15f2af-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.360219 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.360250 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.360282 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.360319 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.360352 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.360396 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-tls\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: E0122 14:06:32.363732 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:32.86366561 +0000 UTC m=+141.565565813 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.369515 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.370007 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-image-import-ca\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: W0122 14:06:32.375418 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa063689_071e_47b5_85d1_2bfe8d3b5fec.slice/crio-6b8ab049e15cb1df9731a995248e66295fee02b0ba30ef6fa9bc1cd40f19e7b8 WatchSource:0}: Error finding container 6b8ab049e15cb1df9731a995248e66295fee02b0ba30ef6fa9bc1cd40f19e7b8: Status 404 returned error can't find the container with id 6b8ab049e15cb1df9731a995248e66295fee02b0ba30ef6fa9bc1cd40f19e7b8
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.379293 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-serving-cert\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.389670 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-encryption-config\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.396543 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb7d48ad-03ae-4594-8c04-d9afa9dc453c-etcd-client\") pod \"apiserver-76f77b778f-vcjmg\" (UID: \"eb7d48ad-03ae-4594-8c04-d9afa9dc453c\") " pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.432313 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461105 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461273 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-registration-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461315 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461361 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25f78c50-1bd7-4abd-b231-a932aa15f2af-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461408 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe863e9d-682c-4c71-b6b5-901ced4fcf87-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461442 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hsv\" (UniqueName: \"kubernetes.io/projected/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-kube-api-access-l6hsv\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461500 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmgm7\" (UniqueName: \"kubernetes.io/projected/3c000bea-d0aa-4b12-941a-cf3b8db64b40-kube-api-access-fmgm7\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461525 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrjfb\" (UniqueName: \"kubernetes.io/projected/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-kube-api-access-jrjfb\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461574 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-csi-data-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461598 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-bound-sa-token\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461619 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/708a3ae8-4d39-4cb3-b465-6da7854c28fd-proxy-tls\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461678 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-tls\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461712 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqz8s\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-kube-api-access-tqz8s\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461734 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" 
(UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-mountpoint-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461757 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-apiservice-cert\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461808 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnr9b\" (UniqueName: \"kubernetes.io/projected/f0d5b3cc-c956-48f1-8fd8-66e815f6ae05-kube-api-access-fnr9b\") pod \"migrator-59844c95c7-wl6g5\" (UID: \"f0d5b3cc-c956-48f1-8fd8-66e815f6ae05\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461835 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25f78c50-1bd7-4abd-b231-a932aa15f2af-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461856 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-webhook-cert\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 
14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461933 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-proxy-tls\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461955 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6b6\" (UniqueName: \"kubernetes.io/projected/fe863e9d-682c-4c71-b6b5-901ced4fcf87-kube-api-access-sb6b6\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.461982 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-plugins-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462000 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-tmpfs\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462035 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crm4v\" (UniqueName: 
\"kubernetes.io/projected/c319403b-3764-4420-8e53-06fb93e21a23-kube-api-access-crm4v\") pod \"control-plane-machine-set-operator-78cbb6b69f-bpwfq\" (UID: \"c319403b-3764-4420-8e53-06fb93e21a23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462058 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2pbh\" (UniqueName: \"kubernetes.io/projected/708a3ae8-4d39-4cb3-b465-6da7854c28fd-kube-api-access-q2pbh\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462182 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/708a3ae8-4d39-4cb3-b465-6da7854c28fd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462269 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-socket-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462296 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-certificates\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462320 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c319403b-3764-4420-8e53-06fb93e21a23-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bpwfq\" (UID: \"c319403b-3764-4420-8e53-06fb93e21a23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462364 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c000bea-d0aa-4b12-941a-cf3b8db64b40-config\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462386 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-srv-cert\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462489 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpqbt\" (UniqueName: \"kubernetes.io/projected/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-kube-api-access-hpqbt\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462534 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-trusted-ca\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462556 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zfqs\" (UniqueName: \"kubernetes.io/projected/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-kube-api-access-5zfqs\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462576 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c000bea-d0aa-4b12-941a-cf3b8db64b40-serving-cert\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462601 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462625 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/708a3ae8-4d39-4cb3-b465-6da7854c28fd-images\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.462645 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe863e9d-682c-4c71-b6b5-901ced4fcf87-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv" Jan 22 14:06:32 crc kubenswrapper[4801]: E0122 14:06:32.462768 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:32.96274867 +0000 UTC m=+141.664648853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.464502 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25f78c50-1bd7-4abd-b231-a932aa15f2af-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.473061 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-tls\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.499028 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sx6df" event={"ID":"aa063689-071e-47b5-85d1-2bfe8d3b5fec","Type":"ContainerStarted","Data":"6b8ab049e15cb1df9731a995248e66295fee02b0ba30ef6fa9bc1cd40f19e7b8"} Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.505816 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" event={"ID":"74adbb74-aba6-4b36-a89f-d8f09ad1b241","Type":"ContainerStarted","Data":"2ac7161c83aca66e6468dc637977dfafb8b4c5ea52b7ba36051bb4afbc2d81ec"} Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.510374 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" event={"ID":"9790065c-0263-4186-b043-a531f264b880","Type":"ContainerStarted","Data":"cea3102af286cf30d280f7e025cc0dc37187c1f76155b3eca46310f6dd8013b9"} Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566252 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrjfb\" (UniqueName: \"kubernetes.io/projected/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-kube-api-access-jrjfb\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566705 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-csi-data-dir\") pod \"csi-hostpathplugin-879qk\" (UID: 
\"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566763 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/708a3ae8-4d39-4cb3-b465-6da7854c28fd-proxy-tls\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566801 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-bound-sa-token\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566844 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-mountpoint-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566876 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqz8s\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-kube-api-access-tqz8s\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566875 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-csi-data-dir\") pod 
\"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.566894 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-apiservice-cert\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567063 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnr9b\" (UniqueName: \"kubernetes.io/projected/f0d5b3cc-c956-48f1-8fd8-66e815f6ae05-kube-api-access-fnr9b\") pod \"migrator-59844c95c7-wl6g5\" (UID: \"f0d5b3cc-c956-48f1-8fd8-66e815f6ae05\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567093 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25f78c50-1bd7-4abd-b231-a932aa15f2af-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567120 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-webhook-cert\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567170 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-proxy-tls\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567197 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6b6\" (UniqueName: \"kubernetes.io/projected/fe863e9d-682c-4c71-b6b5-901ced4fcf87-kube-api-access-sb6b6\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567219 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-plugins-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567239 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-tmpfs\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567265 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2pbh\" (UniqueName: \"kubernetes.io/projected/708a3ae8-4d39-4cb3-b465-6da7854c28fd-kube-api-access-q2pbh\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" Jan 22 
14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567294 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crm4v\" (UniqueName: \"kubernetes.io/projected/c319403b-3764-4420-8e53-06fb93e21a23-kube-api-access-crm4v\") pod \"control-plane-machine-set-operator-78cbb6b69f-bpwfq\" (UID: \"c319403b-3764-4420-8e53-06fb93e21a23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567496 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-certs\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567546 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/708a3ae8-4d39-4cb3-b465-6da7854c28fd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567632 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-socket-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567686 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-certificates\") pod 
\"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567739 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c319403b-3764-4420-8e53-06fb93e21a23-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bpwfq\" (UID: \"c319403b-3764-4420-8e53-06fb93e21a23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567804 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c000bea-d0aa-4b12-941a-cf3b8db64b40-config\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567829 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-srv-cert\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567877 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpqbt\" (UniqueName: \"kubernetes.io/projected/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-kube-api-access-hpqbt\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567916 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-node-bootstrap-token\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567948 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-trusted-ca\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567975 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zfqs\" (UniqueName: \"kubernetes.io/projected/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-kube-api-access-5zfqs\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.567998 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c000bea-d0aa-4b12-941a-cf3b8db64b40-serving-cert\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568056 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568083 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe863e9d-682c-4c71-b6b5-901ced4fcf87-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568111 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/708a3ae8-4d39-4cb3-b465-6da7854c28fd-images\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568171 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-registration-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568213 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52h9w\" (UniqueName: \"kubernetes.io/projected/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-kube-api-access-52h9w\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568253 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568370 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-socket-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568378 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568410 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe863e9d-682c-4c71-b6b5-901ced4fcf87-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"
Jan 22 14:06:32 crc kubenswrapper[4801]: E0122 14:06:32.568664 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.068649726 +0000 UTC m=+141.770549979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.568936 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe863e9d-682c-4c71-b6b5-901ced4fcf87-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.569868 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-plugins-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.570064 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-certificates\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.570573 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/708a3ae8-4d39-4cb3-b465-6da7854c28fd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.570612 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmgm7\" (UniqueName: \"kubernetes.io/projected/3c000bea-d0aa-4b12-941a-cf3b8db64b40-kube-api-access-fmgm7\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.570634 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hsv\" (UniqueName: \"kubernetes.io/projected/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-kube-api-access-l6hsv\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.571057 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/708a3ae8-4d39-4cb3-b465-6da7854c28fd-images\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.571477 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.571899 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-registration-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.572138 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c000bea-d0aa-4b12-941a-cf3b8db64b40-config\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.572521 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-mountpoint-dir\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.574697 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c000bea-d0aa-4b12-941a-cf3b8db64b40-serving-cert\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.575544 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/708a3ae8-4d39-4cb3-b465-6da7854c28fd-proxy-tls\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.576079 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-proxy-tls\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.578242 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25f78c50-1bd7-4abd-b231-a932aa15f2af-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.578363 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-trusted-ca\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.581802 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe863e9d-682c-4c71-b6b5-901ced4fcf87-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.582184 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-srv-cert\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.589163 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqz8s\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-kube-api-access-tqz8s\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.591907 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.592511 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c319403b-3764-4420-8e53-06fb93e21a23-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bpwfq\" (UID: \"c319403b-3764-4420-8e53-06fb93e21a23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.599426 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpqbt\" (UniqueName: \"kubernetes.io/projected/a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9-kube-api-access-hpqbt\") pod \"olm-operator-6b444d44fb-nrqhg\" (UID: \"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.601095 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnr9b\" (UniqueName: \"kubernetes.io/projected/f0d5b3cc-c956-48f1-8fd8-66e815f6ae05-kube-api-access-fnr9b\") pod \"migrator-59844c95c7-wl6g5\" (UID: \"f0d5b3cc-c956-48f1-8fd8-66e815f6ae05\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.601654 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6b6\" (UniqueName: \"kubernetes.io/projected/fe863e9d-682c-4c71-b6b5-901ced4fcf87-kube-api-access-sb6b6\") pod \"kube-storage-version-migrator-operator-b67b599dd-nqdqv\" (UID: \"fe863e9d-682c-4c71-b6b5-901ced4fcf87\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.605907 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrjfb\" (UniqueName: \"kubernetes.io/projected/53ee297f-b387-48f0-ae2f-4eb9bb7dae2c-kube-api-access-jrjfb\") pod \"machine-config-controller-84d6567774-kklp6\" (UID: \"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.606545 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-bound-sa-token\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.606725 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2pbh\" (UniqueName: \"kubernetes.io/projected/708a3ae8-4d39-4cb3-b465-6da7854c28fd-kube-api-access-q2pbh\") pod \"machine-config-operator-74547568cd-thrzc\" (UID: \"708a3ae8-4d39-4cb3-b465-6da7854c28fd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.609387 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmgm7\" (UniqueName: \"kubernetes.io/projected/3c000bea-d0aa-4b12-941a-cf3b8db64b40-kube-api-access-fmgm7\") pod \"service-ca-operator-777779d784-j5l5v\" (UID: \"3c000bea-d0aa-4b12-941a-cf3b8db64b40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.612907 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crm4v\" (UniqueName: \"kubernetes.io/projected/c319403b-3764-4420-8e53-06fb93e21a23-kube-api-access-crm4v\") pod \"control-plane-machine-set-operator-78cbb6b69f-bpwfq\" (UID: \"c319403b-3764-4420-8e53-06fb93e21a23\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.617858 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-tmpfs\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.624061 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-apiservice-cert\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.624846 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hsv\" (UniqueName: \"kubernetes.io/projected/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-kube-api-access-l6hsv\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.625151 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zfqs\" (UniqueName: \"kubernetes.io/projected/bcd466a4-1fcd-47e6-9a9c-fae349c4806c-kube-api-access-5zfqs\") pod \"csi-hostpathplugin-879qk\" (UID: \"bcd466a4-1fcd-47e6-9a9c-fae349c4806c\") " pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.626773 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19ca04fb-2b66-4dfb-b760-6cd54c1eef1b-webhook-cert\") pod \"packageserver-d55dfcdfc-sfjhw\" (UID: \"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.690176 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:32 crc kubenswrapper[4801]: E0122 14:06:32.690410 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.190373455 +0000 UTC m=+141.892273648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.690878 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-node-bootstrap-token\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.690963 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52h9w\" (UniqueName: \"kubernetes.io/projected/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-kube-api-access-52h9w\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.691045 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: E0122 14:06:32.691643 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.191633391 +0000 UTC m=+141.893533574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.695104 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-certs\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.698951 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-node-bootstrap-token\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.702490 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-certs\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.716462 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xcfrl"]
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.728791 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52h9w\" (UniqueName: \"kubernetes.io/projected/0965a214-4ab2-4160-9b4e-afea3ebf7f2f-kube-api-access-52h9w\") pod \"machine-config-server-6q4sx\" (UID: \"0965a214-4ab2-4160-9b4e-afea3ebf7f2f\") " pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.733507 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.752724 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.770639 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg"]
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.770864 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"
Jan 22 14:06:32 crc kubenswrapper[4801]: W0122 14:06:32.781889 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aad65ae_dbd7_48ef_a84e_8f60be1848c4.slice/crio-d9809f19824f10a957872e101e80e9c90f5a4041ea35ea4df3ca57734b247c0a WatchSource:0}: Error finding container d9809f19824f10a957872e101e80e9c90f5a4041ea35ea4df3ca57734b247c0a: Status 404 returned error can't find the container with id d9809f19824f10a957872e101e80e9c90f5a4041ea35ea4df3ca57734b247c0a
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.785096 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.803479 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:32 crc kubenswrapper[4801]: E0122 14:06:32.804031 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.304011513 +0000 UTC m=+142.005911696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.814747 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.823788 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.832520 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.849930 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.893249 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-879qk"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.899761 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6q4sx"
Jan 22 14:06:32 crc kubenswrapper[4801]: I0122 14:06:32.908697 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:32 crc kubenswrapper[4801]: E0122 14:06:32.909077 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.409063754 +0000 UTC m=+142.110963947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.010363 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.010996 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.510976495 +0000 UTC m=+142.212876688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.113022 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.113390 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.613374641 +0000 UTC m=+142.315274824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.214828 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.215179 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.715156328 +0000 UTC m=+142.417056511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.321319 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.322337 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.822325431 +0000 UTC m=+142.524225614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.423007 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.423776 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:33.923748388 +0000 UTC m=+142.625648591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.521154 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" event={"ID":"1fa6610e-0cfe-4a02-be2d-014a2ad84215","Type":"ContainerStarted","Data":"31db8be78517a4d75faccca5a39d11d713261a3a023a2da99b3d5de927617bed"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.522381 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6q4sx" event={"ID":"0965a214-4ab2-4160-9b4e-afea3ebf7f2f","Type":"ContainerStarted","Data":"5543fb0451aa66e17de58efc370247868d787fccea72f3f5b71483c5b894506e"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.524585 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.524866 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.024855696 +0000 UTC m=+142.726755879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.530042 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" event={"ID":"5f171ae3-24fe-42e0-b8dd-0e733fc33381","Type":"ContainerStarted","Data":"deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.531092 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.543890 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" event={"ID":"74adbb74-aba6-4b36-a89f-d8f09ad1b241","Type":"ContainerStarted","Data":"3225bf4c243959917911309646465e26317884f0e7fb0de80372e443fda7ad75"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.548836 4801 generic.go:334] "Generic (PLEG): container finished" podID="c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea" containerID="a24e23c01e6fa34488e5a3b97f1c702e761107d611a6547473966315220a8451" exitCode=0
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.548920 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" event={"ID":"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea","Type":"ContainerDied","Data":"a24e23c01e6fa34488e5a3b97f1c702e761107d611a6547473966315220a8451"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.550763 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" podStartSLOduration=121.550746679 podStartE2EDuration="2m1.550746679s" podCreationTimestamp="2026-01-22 14:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:33.550536703 +0000 UTC m=+142.252436906" watchObservedRunningTime="2026-01-22 14:06:33.550746679 +0000 UTC m=+142.252646862"
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.551911 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" event={"ID":"9790065c-0263-4186-b043-a531f264b880","Type":"ContainerStarted","Data":"322497a3155fc411ec6bf72bf9a67ce698484ba4080bd90373acdfce67e8835c"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.554637 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" event={"ID":"d6346af0-261a-48ce-8a89-c7d0d287bdce","Type":"ContainerStarted","Data":"e54f7d30aef3d614c73f4caf378b654db3f82d7718079911921edf06321c8e46"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.557136 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" event={"ID":"0aad65ae-dbd7-48ef-a84e-8f60be1848c4","Type":"ContainerStarted","Data":"d9809f19824f10a957872e101e80e9c90f5a4041ea35ea4df3ca57734b247c0a"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.559877 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr" event={"ID":"83688763-e6ec-4879-b214-03d2e00da08e","Type":"ContainerStarted","Data":"9d1fa72eee31c8dd6595c0d87373a1ee7d98b7bdb9f65a5ea01c20691203a485"}
Jan 22 14:06:33 crc kubenswrapper[4801]: I0122
14:06:33.563399 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" event={"ID":"7d2c98a4-f3a8-4200-8432-6f68459320ca","Type":"ContainerStarted","Data":"19715c9607ebc5693073fa43c0f262316a7afd480e40c3fecb7a5aa8e4acebb6"} Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.563850 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.566097 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" event={"ID":"44be3897-5e75-4a25-b586-79c33d07da2c","Type":"ContainerStarted","Data":"c5eb0e37a20589eb9f11798c5320e63676348b4f97b13e9fa3341c3927d6fddc"} Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.566622 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.568234 4801 patch_prober.go:28] interesting pod/console-operator-58897d9998-g5mt6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.568291 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" podUID="44be3897-5e75-4a25-b586-79c33d07da2c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.569154 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-57pdz" 
event={"ID":"9f30eedd-eee4-43f7-a1dd-4614580c7f0a","Type":"ContainerStarted","Data":"33024da0b1a9617027b315882cc86341e6c6bdf98782509014f1e1b6a98b1180"} Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.586511 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" podStartSLOduration=123.586490513 podStartE2EDuration="2m3.586490513s" podCreationTimestamp="2026-01-22 14:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:33.58289964 +0000 UTC m=+142.284799843" watchObservedRunningTime="2026-01-22 14:06:33.586490513 +0000 UTC m=+142.288390696" Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.588579 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" event={"ID":"1ff335e7-91e9-438a-94f2-005a67ddfc2b","Type":"ContainerStarted","Data":"9aac7615b347f3f2ec880f5306715993643c7c318e7d11f7fbdf6a6744b58111"} Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.612170 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" podStartSLOduration=122.612146969 podStartE2EDuration="2m2.612146969s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:33.604391666 +0000 UTC m=+142.306291869" watchObservedRunningTime="2026-01-22 14:06:33.612146969 +0000 UTC m=+142.314047152" Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.625684 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.625719 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.125703437 +0000 UTC m=+142.827603620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.627720 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.630059 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.130040322 +0000 UTC m=+142.831940575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.729400 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.729546 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.229520823 +0000 UTC m=+142.931421006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.729811 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.730091 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.230079439 +0000 UTC m=+142.931979622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.831089 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.831324 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.331302291 +0000 UTC m=+143.033202474 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.831697 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.831981 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.33197156 +0000 UTC m=+143.033871743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.932699 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.932830 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.432802181 +0000 UTC m=+143.134702364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:33 crc kubenswrapper[4801]: I0122 14:06:33.932921 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:33 crc kubenswrapper[4801]: E0122 14:06:33.933261 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.433250374 +0000 UTC m=+143.135150557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.020846 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.020890 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.021742 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.034251 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.034496 4801 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.534468275 +0000 UTC m=+143.236368458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.034711 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.035172 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.535162515 +0000 UTC m=+143.237062698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.087182 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tzbl4"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.089415 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gw2wt"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.091775 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nbhfx"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.109002 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.114587 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rg8c5"] Jan 22 14:06:34 crc kubenswrapper[4801]: W0122 14:06:34.117530 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc477f6d9_e4b6_4a2d_bfd2_e54d5039ec94.slice/crio-bbf97dc8a8e0c027dd6b95968532c6ac2563ec0a54b7c3eccc41d23c0f30233d WatchSource:0}: Error finding container bbf97dc8a8e0c027dd6b95968532c6ac2563ec0a54b7c3eccc41d23c0f30233d: Status 404 returned error can't find the container with id bbf97dc8a8e0c027dd6b95968532c6ac2563ec0a54b7c3eccc41d23c0f30233d Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.125156 4801 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.141958 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.142060 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.642039109 +0000 UTC m=+143.343939282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.142435 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.143141 4801 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.64312559 +0000 UTC m=+143.345025773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: W0122 14:06:34.145289 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod322bcacd_4b58_46cf_b37e_2ffda3f87b24.slice/crio-f15db0d7085387a77c1f7c148095c21662654af16b7d666c82779f5f8f496ba0 WatchSource:0}: Error finding container f15db0d7085387a77c1f7c148095c21662654af16b7d666c82779f5f8f496ba0: Status 404 returned error can't find the container with id f15db0d7085387a77c1f7c148095c21662654af16b7d666c82779f5f8f496ba0 Jan 22 14:06:34 crc kubenswrapper[4801]: W0122 14:06:34.158660 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba12591_7bc1_4df8_a9c0_d515689eefe5.slice/crio-b77c8918f74cdb07c530a7e2bfca504cfdda2ae85e2435ef5ca2f4f0e16c188c WatchSource:0}: Error finding container b77c8918f74cdb07c530a7e2bfca504cfdda2ae85e2435ef5ca2f4f0e16c188c: Status 404 returned error can't find the container with id b77c8918f74cdb07c530a7e2bfca504cfdda2ae85e2435ef5ca2f4f0e16c188c Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.247268 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.247707 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.747688477 +0000 UTC m=+143.449588660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.266891 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.350789 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.351187 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-22 14:06:34.851174164 +0000 UTC m=+143.553074337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.420680 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fpc52"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.428793 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-glbkg"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.459961 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcjmg"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.460388 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.460782 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:34.960767255 +0000 UTC m=+143.662667438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.462489 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gwkcf"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.470696 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.499038 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mg4vw"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.512820 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sb84b"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.527929 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.528013 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.545283 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.545369 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6ddc9"] Jan 22 14:06:34 crc 
kubenswrapper[4801]: I0122 14:06:34.563129 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.563491 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.06347939 +0000 UTC m=+143.765379563 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.568239 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.578315 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.578781 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-57pdz" event={"ID":"9f30eedd-eee4-43f7-a1dd-4614580c7f0a","Type":"ContainerStarted","Data":"7da3f29a797cd576f0c28e3cbd418465cd7b1d591d2538f1638a4a3cafa2d2fe"} Jan 22 14:06:34 crc 
kubenswrapper[4801]: I0122 14:06:34.591659 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.596543 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.596591 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-879qk"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.608976 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.624344 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" event={"ID":"1ff335e7-91e9-438a-94f2-005a67ddfc2b","Type":"ContainerStarted","Data":"56312367b6265ce3ebd1c91e2310ad22c78a63ffcd6de45429906bb9cecb09b4"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.631847 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.634072 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.634339 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-57pdz" podStartSLOduration=123.634316041 podStartE2EDuration="2m3.634316041s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 
14:06:34.605362531 +0000 UTC m=+143.307262724" watchObservedRunningTime="2026-01-22 14:06:34.634316041 +0000 UTC m=+143.336216254" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.657258 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.666010 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.666411 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.16639023 +0000 UTC m=+143.868290413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.676503 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6q4sx" event={"ID":"0965a214-4ab2-4160-9b4e-afea3ebf7f2f","Type":"ContainerStarted","Data":"c43eb72aaeda44e222f7783607aacf67cc85710466bd7fac692180c4b351ef3b"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.697129 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dd4gg" podStartSLOduration=123.697109991 podStartE2EDuration="2m3.697109991s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:34.662702864 +0000 UTC m=+143.364603047" watchObservedRunningTime="2026-01-22 14:06:34.697109991 +0000 UTC m=+143.399010174" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.700985 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v"] Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.708008 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" event={"ID":"d6346af0-261a-48ce-8a89-c7d0d287bdce","Type":"ContainerStarted","Data":"21a8237a8ca2027ca2fe3c31ed1b329b9564f2d11d5fc0ff7d3b9a3e37208183"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.721899 4801 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6q4sx" podStartSLOduration=6.721877971 podStartE2EDuration="6.721877971s" podCreationTimestamp="2026-01-22 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:34.716378013 +0000 UTC m=+143.418278206" watchObservedRunningTime="2026-01-22 14:06:34.721877971 +0000 UTC m=+143.423778154" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.724909 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbhfx" event={"ID":"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94","Type":"ContainerStarted","Data":"bbf97dc8a8e0c027dd6b95968532c6ac2563ec0a54b7c3eccc41d23c0f30233d"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.746604 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttggq" podStartSLOduration=123.746577179 podStartE2EDuration="2m3.746577179s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:34.743302185 +0000 UTC m=+143.445202368" watchObservedRunningTime="2026-01-22 14:06:34.746577179 +0000 UTC m=+143.448477362" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.770273 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" event={"ID":"6ba12591-7bc1-4df8-a9c0-d515689eefe5","Type":"ContainerStarted","Data":"b77c8918f74cdb07c530a7e2bfca504cfdda2ae85e2435ef5ca2f4f0e16c188c"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.770913 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.771680 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.271663008 +0000 UTC m=+143.973563191 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.782392 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" event={"ID":"1fa6610e-0cfe-4a02-be2d-014a2ad84215","Type":"ContainerStarted","Data":"aa98143cbd95c6cd2de704b59a1d706d96a906b23cd15eb342aa00bc070b66da"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.803247 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sx6df" event={"ID":"aa063689-071e-47b5-85d1-2bfe8d3b5fec","Type":"ContainerStarted","Data":"fb7d261cf17601c1e03cf346a2bd180b28fb37292e75c850dce0c033478e4254"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.804122 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-sx6df" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.809822 4801 
patch_prober.go:28] interesting pod/downloads-7954f5f757-sx6df container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.809879 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sx6df" podUID="aa063689-071e-47b5-85d1-2bfe8d3b5fec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.824608 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" event={"ID":"0aad65ae-dbd7-48ef-a84e-8f60be1848c4","Type":"ContainerStarted","Data":"67527a073b7da7fe79f9c8ebbe5ee7b593a6669b39dc0b267f182252c3d1ac1c"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.828328 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nbhfx" podStartSLOduration=123.828313162 podStartE2EDuration="2m3.828313162s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:34.777351961 +0000 UTC m=+143.479252144" watchObservedRunningTime="2026-01-22 14:06:34.828313162 +0000 UTC m=+143.530213345" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.829567 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s84ts" podStartSLOduration=124.829562558 podStartE2EDuration="2m4.829562558s" podCreationTimestamp="2026-01-22 14:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-22 14:06:34.828386764 +0000 UTC m=+143.530286947" watchObservedRunningTime="2026-01-22 14:06:34.829562558 +0000 UTC m=+143.531462741" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.833521 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" event={"ID":"28df97df-6dc9-4670-b14d-eae33c4de87d","Type":"ContainerStarted","Data":"e24acb694a1ddbb23ca5e788872abd4b45329a33788efc6ab617d6290d6f38ef"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.838932 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.843957 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:34 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:34 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:34 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.844009 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.853941 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-sx6df" podStartSLOduration=123.853922926 podStartE2EDuration="2m3.853922926s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-22 14:06:34.849036826 +0000 UTC m=+143.550937019" watchObservedRunningTime="2026-01-22 14:06:34.853922926 +0000 UTC m=+143.555823109" Jan 22 14:06:34 crc kubenswrapper[4801]: W0122 14:06:34.868153 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcd466a4_1fcd_47e6_9a9c_fae349c4806c.slice/crio-c1af188f1581c00f02af8f67a62780bbe34b61a34a7c263720cc36df863597fb WatchSource:0}: Error finding container c1af188f1581c00f02af8f67a62780bbe34b61a34a7c263720cc36df863597fb: Status 404 returned error can't find the container with id c1af188f1581c00f02af8f67a62780bbe34b61a34a7c263720cc36df863597fb Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.878671 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" event={"ID":"33406000-fb47-484f-a43f-80332a3d82b4","Type":"ContainerStarted","Data":"352af7f181f9a3be7fcb96ec9756132b42934eaf951615da2ec1db685ed7de8c"} Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.881925 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.381893758 +0000 UTC m=+144.083793941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.882147 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.882516 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.883613 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.383599617 +0000 UTC m=+144.085499800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.879282 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" event={"ID":"33406000-fb47-484f-a43f-80332a3d82b4","Type":"ContainerStarted","Data":"ed4bba3274dd656e812a4d55ae2f62df4ff7e8c6684c24fd0e771540fd51725a"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.885534 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xcfrl" podStartSLOduration=123.885509661 podStartE2EDuration="2m3.885509661s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:34.878189991 +0000 UTC m=+143.580090184" watchObservedRunningTime="2026-01-22 14:06:34.885509661 +0000 UTC m=+143.587409844" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.896469 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr" event={"ID":"83688763-e6ec-4879-b214-03d2e00da08e","Type":"ContainerStarted","Data":"b84ecb61a3aa085bb4e29a313ab88b6c24d02ca5579db6d1264e4f0a427b44bb"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.902663 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" 
event={"ID":"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb","Type":"ContainerStarted","Data":"ad0bf71c33b163017357263796ece210a27b7cb8da881720bd94ef59df336e0d"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.909156 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" event={"ID":"801b8dae-b259-4026-816a-794d9bcc81a4","Type":"ContainerStarted","Data":"04b325802f292db6a36ea5052c8133040d1c7342205f0d319c7ddb82b11c58cc"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.911280 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" event={"ID":"322bcacd-4b58-46cf-b37e-2ffda3f87b24","Type":"ContainerStarted","Data":"f15db0d7085387a77c1f7c148095c21662654af16b7d666c82779f5f8f496ba0"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.919941 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr" podStartSLOduration=123.919925158 podStartE2EDuration="2m3.919925158s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:34.917660133 +0000 UTC m=+143.619560336" watchObservedRunningTime="2026-01-22 14:06:34.919925158 +0000 UTC m=+143.621825341" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.925223 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gw2wt" event={"ID":"15a0fb0c-e901-4655-aa08-e52e944cea7c","Type":"ContainerStarted","Data":"0ee799d696a90f22481e6b4993220d9cd1c8bd668584a6bcfd464b0eea9e1ecc"} Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.953113 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-g5mt6" Jan 22 14:06:34 crc 
kubenswrapper[4801]: I0122 14:06:34.954250 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zvlsl" podStartSLOduration=123.954238822 podStartE2EDuration="2m3.954238822s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:34.953985814 +0000 UTC m=+143.655885997" watchObservedRunningTime="2026-01-22 14:06:34.954238822 +0000 UTC m=+143.656138995" Jan 22 14:06:34 crc kubenswrapper[4801]: I0122 14:06:34.984285 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:34 crc kubenswrapper[4801]: E0122 14:06:34.987599 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.487580677 +0000 UTC m=+144.189480860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.033019 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-845tw" podStartSLOduration=125.032994569 podStartE2EDuration="2m5.032994569s" podCreationTimestamp="2026-01-22 14:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:35.02812758 +0000 UTC m=+143.730027763" watchObservedRunningTime="2026-01-22 14:06:35.032994569 +0000 UTC m=+143.734894752" Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.062702 4801 csr.go:261] certificate signing request csr-7fq4b is approved, waiting to be issued Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.080642 4801 csr.go:257] certificate signing request csr-7fq4b is issued Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.086266 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.090846 4801 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.590831177 +0000 UTC m=+144.292731360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.187755 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.187944 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.687919299 +0000 UTC m=+144.389819482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.188312 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.188746 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.688731442 +0000 UTC m=+144.390631625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.289215 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.289720 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.789704847 +0000 UTC m=+144.491605030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.399237 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.399809 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:35.899797513 +0000 UTC m=+144.601697696 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.501424 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.502611 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.002579249 +0000 UTC m=+144.704479442 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.630826 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.631301 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.131289529 +0000 UTC m=+144.833189712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.739145 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.739729 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.239715007 +0000 UTC m=+144.941615190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.840600 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.840981 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.34096986 +0000 UTC m=+145.042870043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.854533 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 22 14:06:35 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld
Jan 22 14:06:35 crc kubenswrapper[4801]: [+]process-running ok
Jan 22 14:06:35 crc kubenswrapper[4801]: healthz check failed
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.854585 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.942799 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:35 crc kubenswrapper[4801]: E0122 14:06:35.943086 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.443072787 +0000 UTC m=+145.144972970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.945353 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" event={"ID":"708a3ae8-4d39-4cb3-b465-6da7854c28fd","Type":"ContainerStarted","Data":"465f64ee71d5e932f85db01e328376ace0170e9bb0123e1d56b00a09aec9f0b0"}
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.950384 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" event={"ID":"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b","Type":"ContainerStarted","Data":"67bd23c8a7f1a908bb4c9305282adba201c8af2394eb1284dd8de0372c15f106"}
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.950429 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" event={"ID":"19ca04fb-2b66-4dfb-b760-6cd54c1eef1b","Type":"ContainerStarted","Data":"7e81d0ea6f7d7803faade33ac7f5f6622f61798fe25ee5579d4fe9a98e2fd6d1"}
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.958232 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw"
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.973241 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" event={"ID":"322bcacd-4b58-46cf-b37e-2ffda3f87b24","Type":"ContainerStarted","Data":"00d77013b4b9a218cb60d56614f551fc5ff1956385423015944976da84746515"}
Jan 22 14:06:35 crc kubenswrapper[4801]: I0122 14:06:35.995155 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" podStartSLOduration=124.995136499 podStartE2EDuration="2m4.995136499s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:35.99341977 +0000 UTC m=+144.695319973" watchObservedRunningTime="2026-01-22 14:06:35.995136499 +0000 UTC m=+144.697036682"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.002209 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" event={"ID":"5c251ddf-82e0-4cda-8303-ec6e65474e45","Type":"ContainerStarted","Data":"346e4b3482dd4363e8824eb8f436a08ad0516d16fb23ce7f796a840d1093fe52"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.002605 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" event={"ID":"5c251ddf-82e0-4cda-8303-ec6e65474e45","Type":"ContainerStarted","Data":"e873cfe62f8a7c636f59abffe7c4db00612190638affc3dccd38171e3e93f5c8"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.018240 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" event={"ID":"3c000bea-d0aa-4b12-941a-cf3b8db64b40","Type":"ContainerStarted","Data":"b1b16e263940eb6ea302a42493eef23213cc28c6935bb53ab83744992652be8a"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.026551 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" event={"ID":"28df97df-6dc9-4670-b14d-eae33c4de87d","Type":"ContainerStarted","Data":"6a4dd356fbdeac9ba0681465ed39a880071b9a8d0e61e5d7c3e57cc510d7ea0c"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.039307 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" event={"ID":"a59a6854-54d3-4b6a-a037-eb212513fa8f","Type":"ContainerStarted","Data":"d7d279b00262a79e262a342f5d4dddeba88b911e43a2907b5b70e1048b1c5de1"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.039366 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" event={"ID":"a59a6854-54d3-4b6a-a037-eb212513fa8f","Type":"ContainerStarted","Data":"0e6267a69d72c59e0c5843e8b3febe54bf923d795919a8e8f42a0b6ada755167"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.039619 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sb84b" podStartSLOduration=124.039597634 podStartE2EDuration="2m4.039597634s" podCreationTimestamp="2026-01-22 14:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.0387684 +0000 UTC m=+144.740668593" watchObservedRunningTime="2026-01-22 14:06:36.039597634 +0000 UTC m=+144.741497817"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.044363 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.047027 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5" event={"ID":"f0d5b3cc-c956-48f1-8fd8-66e815f6ae05","Type":"ContainerStarted","Data":"0831159ac6bfed2d948b2c036f3d0e5e91996341876edf4912341c03cd7af6af"}
Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.047105 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.547088208 +0000 UTC m=+145.248988461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.053256 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gw2wt" event={"ID":"15a0fb0c-e901-4655-aa08-e52e944cea7c","Type":"ContainerStarted","Data":"7803c710b0aa95e52c18c495e0859c2fba7ac1c531424a193970462fa74d34cc"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.068150 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nsgnh" podStartSLOduration=125.068134702 podStartE2EDuration="2m5.068134702s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.066760992 +0000 UTC m=+144.768661195" watchObservedRunningTime="2026-01-22 14:06:36.068134702 +0000 UTC m=+144.770034885"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.075233 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" event={"ID":"eb7d48ad-03ae-4594-8c04-d9afa9dc453c","Type":"ContainerStarted","Data":"64cb6b406c0bada80ca9e5c0efacd88b36c3c783fa57c986a33f30d9d424c0b2"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.082211 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-22 14:01:35 +0000 UTC, rotation deadline is 2026-12-15 14:01:53.168841553 +0000 UTC
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.082240 4801 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7847h55m17.086603667s for next certificate rotation
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.087142 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" event={"ID":"801b8dae-b259-4026-816a-794d9bcc81a4","Type":"ContainerStarted","Data":"7c88663fee5d8925f11cfa4341ae4e519830beee4b48634fc8d594739f36432b"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.130515 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbhfx" event={"ID":"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94","Type":"ContainerStarted","Data":"747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.153249 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" podStartSLOduration=126.153230041 podStartE2EDuration="2m6.153230041s" podCreationTimestamp="2026-01-22 14:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.109548839 +0000 UTC m=+144.811449032" watchObservedRunningTime="2026-01-22 14:06:36.153230041 +0000 UTC m=+144.855130224"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.155032 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.155148 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.655131255 +0000 UTC m=+145.357031438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.155267 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.157638 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.657620537 +0000 UTC m=+145.359520810 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.164440 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" event={"ID":"32fa36ec-07ad-4ebe-8e79-3692f621cd37","Type":"ContainerStarted","Data":"3515190969f39c2495f6c9f06f4240129ab697aa8e14235afdf461c125e17b62"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.174567 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" event={"ID":"6ba12591-7bc1-4df8-a9c0-d515689eefe5","Type":"ContainerStarted","Data":"d55596ba9a9d7e96dc2f874d4cb1d3356a87b02bdb79dde184023002ba3b4cc2"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.175936 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-879qk" event={"ID":"bcd466a4-1fcd-47e6-9a9c-fae349c4806c","Type":"ContainerStarted","Data":"c1af188f1581c00f02af8f67a62780bbe34b61a34a7c263720cc36df863597fb"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.179177 4801 generic.go:334] "Generic (PLEG): container finished" podID="b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb" containerID="058a44cf23f1bc09031a5a7dc7ba7fa53b56af326f3123a3c833276f9a307d2a" exitCode=0
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.179267 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" event={"ID":"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb","Type":"ContainerDied","Data":"058a44cf23f1bc09031a5a7dc7ba7fa53b56af326f3123a3c833276f9a307d2a"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.188608 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv" event={"ID":"fe863e9d-682c-4c71-b6b5-901ced4fcf87","Type":"ContainerStarted","Data":"a54ccd425b4953635c9db28dd853fbe8a68c775ee9486b2a75933835fac0ecd8"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.201368 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-glbkg" podStartSLOduration=126.20134943 podStartE2EDuration="2m6.20134943s" podCreationTimestamp="2026-01-22 14:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.154730694 +0000 UTC m=+144.856630877" watchObservedRunningTime="2026-01-22 14:06:36.20134943 +0000 UTC m=+144.903249613"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.206699 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" event={"ID":"d15f1199-bd10-4d84-a488-ffbbe320209a","Type":"ContainerStarted","Data":"7f68caeb7ea4ddf82260ed3f4897ef2a834125c6c3553d6ea81d203fa6dc1236"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.206736 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" event={"ID":"d15f1199-bd10-4d84-a488-ffbbe320209a","Type":"ContainerStarted","Data":"611fe261c2510714acee442ac8d93871668ecc88a46a8e2473381fcb2e49ea27"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.214288 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" event={"ID":"57c1fbaf-6477-491e-8a43-9f780293b8ae","Type":"ContainerStarted","Data":"ee0f773a72179adec06e7d040dc29b84e39fe05a2f7f4378b40081f5ea75592c"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.234018 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" event={"ID":"401b3157-b40f-489c-827f-d4f941e96001","Type":"ContainerStarted","Data":"9691f1f23d44d28dcf6f1a49b91033239fe920cb5b4614c27ad9c936772b5e5e"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.234068 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" event={"ID":"401b3157-b40f-489c-827f-d4f941e96001","Type":"ContainerStarted","Data":"61ae00462826d359b1f89a1db0508f68673f85c1b2401a2b4118c072525356b2"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.234354 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.246437 4801 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mg4vw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body=
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.246511 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" podUID="401b3157-b40f-489c-827f-d4f941e96001" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.256240 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.257425 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.757410717 +0000 UTC m=+145.459310900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.267130 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" event={"ID":"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9","Type":"ContainerStarted","Data":"e5c51aa1b92eaf50e53403ec6815b1d87a9e1c98671c10ca0252b5398e35df53"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.267997 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.268915 4801 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-nrqhg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.268953 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" podUID="a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.275608 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" podStartSLOduration=125.275591979 podStartE2EDuration="2m5.275591979s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.275160706 +0000 UTC m=+144.977060889" watchObservedRunningTime="2026-01-22 14:06:36.275591979 +0000 UTC m=+144.977492172"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.275695 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mhnvv" podStartSLOduration=125.275690361 podStartE2EDuration="2m5.275690361s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.229202139 +0000 UTC m=+144.931102332" watchObservedRunningTime="2026-01-22 14:06:36.275690361 +0000 UTC m=+144.977590554"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.291299 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-glntr" event={"ID":"83688763-e6ec-4879-b214-03d2e00da08e","Type":"ContainerStarted","Data":"2456bd701b67e86fce77cab7c6baec577bf0ee8443864347d9068eb5719aed4f"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.298768 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" event={"ID":"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c","Type":"ContainerStarted","Data":"03db6c8a9b25ddd8ba14191c7b1986a4f5a71ce1374b5332ca42c9402c841f81"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.309669 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" podStartSLOduration=125.309652425 podStartE2EDuration="2m5.309652425s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.308009658 +0000 UTC m=+145.009909841" watchObservedRunningTime="2026-01-22 14:06:36.309652425 +0000 UTC m=+145.011552618"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.322057 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" event={"ID":"c319403b-3764-4420-8e53-06fb93e21a23","Type":"ContainerStarted","Data":"8a888ef43748052eeadf93587f0545dd1cc0ba264ccd9eb443e212bac575ca3c"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.338144 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" event={"ID":"8408fe11-4e94-4069-89bf-9370719c1770","Type":"ContainerStarted","Data":"6ed3912c49b09c7df8da83b0c1ffe225fcfe94e7961186b14122b5220c147653"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.338192 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" event={"ID":"8408fe11-4e94-4069-89bf-9370719c1770","Type":"ContainerStarted","Data":"c6f4ffab29ec516fcd9172f3237e6a3e358c937193c34bc252e5c96a72fbcd85"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.338844 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.353748 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gwkcf" event={"ID":"b156efe9-7dba-4006-bbc0-49cd04c42d32","Type":"ContainerStarted","Data":"1a3b4a40524a681475b3245c5b4b3ed1723d525bf5569c8be72a648c3178416e"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.353804 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gwkcf" event={"ID":"b156efe9-7dba-4006-bbc0-49cd04c42d32","Type":"ContainerStarted","Data":"62b344a62101c9b07a6fed5ce9b3b6f47042aee0a5b30ce41ce317e0521e126b"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.354554 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" podStartSLOduration=125.354536432 podStartE2EDuration="2m5.354536432s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.353819721 +0000 UTC m=+145.055719914" watchObservedRunningTime="2026-01-22 14:06:36.354536432 +0000 UTC m=+145.056436615"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.358493 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh"
Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.360745 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.860727499 +0000 UTC m=+145.562627682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.366236 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" event={"ID":"145bd263-96be-4598-ad6c-3c9a73b92d26","Type":"ContainerStarted","Data":"bc556b99c41b6d05d244061806b6a8242cc6be2c4149c9df26d0ff5cd06b6a03"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.366287 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" event={"ID":"145bd263-96be-4598-ad6c-3c9a73b92d26","Type":"ContainerStarted","Data":"96c459ef1cfd8051623f860d5589322e138ac9dbd6a18216ad695cf413a78e7e"}
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.372662 4801 patch_prober.go:28] interesting pod/downloads-7954f5f757-sx6df container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.372748 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sx6df" podUID="aa063689-071e-47b5-85d1-2bfe8d3b5fec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.378365 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" podStartSLOduration=125.378345224 podStartE2EDuration="2m5.378345224s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.377224792 +0000 UTC m=+145.079124985" watchObservedRunningTime="2026-01-22 14:06:36.378345224 +0000 UTC m=+145.080245407"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.420502 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gwkcf" podStartSLOduration=8.420482232 podStartE2EDuration="8.420482232s" podCreationTimestamp="2026-01-22 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.40751218 +0000 UTC m=+145.109412363" watchObservedRunningTime="2026-01-22 14:06:36.420482232 +0000 UTC m=+145.122382415"
Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.462177 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 22
14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.464426 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:36.964398361 +0000 UTC m=+145.666298554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.564976 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.565498 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.065476998 +0000 UTC m=+145.767377181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.666768 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.666979 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.166957738 +0000 UTC m=+145.868857921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.667072 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.667429 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.167417481 +0000 UTC m=+145.869317664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.767680 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.767961 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.267931372 +0000 UTC m=+145.969831575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.768079 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.768458 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.268428196 +0000 UTC m=+145.970328379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.848284 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:36 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:36 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:36 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.848637 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.869314 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.869691 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-22 14:06:37.369664398 +0000 UTC m=+146.071564641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.958956 4801 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sfjhw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.959019 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" podUID="19ca04fb-2b66-4dfb-b760-6cd54c1eef1b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 22 14:06:36 crc kubenswrapper[4801]: I0122 14:06:36.971372 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:36 crc kubenswrapper[4801]: E0122 14:06:36.971686 4801 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.471672893 +0000 UTC m=+146.173573076 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.073673 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.073966 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.573950994 +0000 UTC m=+146.275851177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.175651 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.176075 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.676055131 +0000 UTC m=+146.377955314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.277963 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.278148 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.778121657 +0000 UTC m=+146.480021840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.278366 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.278751 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.778736755 +0000 UTC m=+146.480637008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.375549 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-879qk" event={"ID":"bcd466a4-1fcd-47e6-9a9c-fae349c4806c","Type":"ContainerStarted","Data":"9bff2f325957759901977c0c1d4803c2b46eea797fa13fa2891a8e4605d41f5a"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.377977 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bpwfq" event={"ID":"c319403b-3764-4420-8e53-06fb93e21a23","Type":"ContainerStarted","Data":"8be831f202dfb0efdfb8332ad930a766a98f9b397d3f3b43c80011a27fcacde6"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.378894 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.379026 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.879008379 +0000 UTC m=+146.580908562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.379100 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.379412 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.87939936 +0000 UTC m=+146.581299543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.380712 4801 generic.go:334] "Generic (PLEG): container finished" podID="eb7d48ad-03ae-4594-8c04-d9afa9dc453c" containerID="5f9e3ba2bdbc42bd5a578b48e104bdb97890148e44c0ebbeaff285d98944f315" exitCode=0 Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.380763 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" event={"ID":"eb7d48ad-03ae-4594-8c04-d9afa9dc453c","Type":"ContainerDied","Data":"5f9e3ba2bdbc42bd5a578b48e104bdb97890148e44c0ebbeaff285d98944f315"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.389892 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gw2wt" event={"ID":"15a0fb0c-e901-4655-aa08-e52e944cea7c","Type":"ContainerStarted","Data":"25a856256f492a9e91d51cbc06a15d5b9986602b47989221bf02539314223517"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.390042 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.414843 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" event={"ID":"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c","Type":"ContainerStarted","Data":"7514d53c3aec189042c92e75a41fc0f218374f9dc945cd6f514555420cfa87eb"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.414897 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" event={"ID":"53ee297f-b387-48f0-ae2f-4eb9bb7dae2c","Type":"ContainerStarted","Data":"98356d2df9cd4d89f3840be6189961a0a2380d22ad3cb8b30d01c1ad3fe7ad99"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.437495 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" event={"ID":"c80af8d9-7c2c-4a4e-bd88-a3845f20b9ea","Type":"ContainerStarted","Data":"311c4285d3bb94d108d31763e0f23019931b2edf62636046bf26d0d9b7f9d7a7"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.439591 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" event={"ID":"6ba12591-7bc1-4df8-a9c0-d515689eefe5","Type":"ContainerStarted","Data":"0742bfcf349c7e6d87a61f6edc3ab0ed53e31c79d0ae9bbc575c52e7b492ac88"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.441550 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" event={"ID":"8408fe11-4e94-4069-89bf-9370719c1770","Type":"ContainerStarted","Data":"1086afe391f1334df4ab5d3e78daba6e996ab09b1bcfb3c63fb8c84b77097e7f"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.443634 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" event={"ID":"b4a7b96d-3d2e-4a3b-b9a5-ce286a0daefb","Type":"ContainerStarted","Data":"74f18f9256dbb5f4bc8c4e45e261ddab711d85ad22b782e1ddf6df0e360df57f"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.444047 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.445373 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5" 
event={"ID":"f0d5b3cc-c956-48f1-8fd8-66e815f6ae05","Type":"ContainerStarted","Data":"324955656ce5bf58cb835f2059d5ca00758255bfe65cfeda975795cd96eee10c"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.445405 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5" event={"ID":"f0d5b3cc-c956-48f1-8fd8-66e815f6ae05","Type":"ContainerStarted","Data":"544ca057ed1d5d49eb58ad512020f464930d9a6352e71456bb1c65d6d975d127"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.447043 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" event={"ID":"322bcacd-4b58-46cf-b37e-2ffda3f87b24","Type":"ContainerStarted","Data":"098a75565cf4f0d00b91019c3a524a3dbb03161dd205e5dd9f26b905d3b22f30"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.466411 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l8prd" podStartSLOduration=126.466391974 podStartE2EDuration="2m6.466391974s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:36.460278823 +0000 UTC m=+145.162179006" watchObservedRunningTime="2026-01-22 14:06:37.466391974 +0000 UTC m=+146.168292157" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.474427 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" event={"ID":"57c1fbaf-6477-491e-8a43-9f780293b8ae","Type":"ContainerStarted","Data":"6e4d8479c6a350b95402dba3afe02cda6de8a4758acbabf24d6f486d653ba53c"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.475380 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.480621 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.481832 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:37.981809136 +0000 UTC m=+146.683709349 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.496982 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.504623 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" event={"ID":"32fa36ec-07ad-4ebe-8e79-3692f621cd37","Type":"ContainerStarted","Data":"956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.505566 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.518019 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" event={"ID":"708a3ae8-4d39-4cb3-b465-6da7854c28fd","Type":"ContainerStarted","Data":"ecccabf87afefb271014b55e1eb6df1b767ef8216dd5e431dc2a98ecbe5f327b"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.518094 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" event={"ID":"708a3ae8-4d39-4cb3-b465-6da7854c28fd","Type":"ContainerStarted","Data":"3b9a6baed182ac858caf86f814ee831af0660c534897ee7afc00e528b8f6be0b"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.521054 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv" event={"ID":"fe863e9d-682c-4c71-b6b5-901ced4fcf87","Type":"ContainerStarted","Data":"5c34e053cab79e265be34163bbf79d4252d3abc9e0e1e1b813897c66fd6d49de"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.526711 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.536092 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" event={"ID":"a3ae8ad9-c93e-4d71-9c4b-0a65f358edf9","Type":"ContainerStarted","Data":"b9cb7371f0e6be3a74d53ef78d4f9e53b508e68ec5ae7ec18cc34a05215ee986"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.543329 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tzbl4" podStartSLOduration=126.543310049 podStartE2EDuration="2m6.543310049s" 
podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.496730114 +0000 UTC m=+146.198630297" watchObservedRunningTime="2026-01-22 14:06:37.543310049 +0000 UTC m=+146.245210232" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.544143 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kklp6" podStartSLOduration=126.544135353 podStartE2EDuration="2m6.544135353s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.542417984 +0000 UTC m=+146.244318167" watchObservedRunningTime="2026-01-22 14:06:37.544135353 +0000 UTC m=+146.246035536" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.555322 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nrqhg" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.582090 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.583549 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.083531222 +0000 UTC m=+146.785431495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.590604 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gw2wt" podStartSLOduration=9.590585434 podStartE2EDuration="9.590585434s" podCreationTimestamp="2026-01-22 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.586185688 +0000 UTC m=+146.288085891" watchObservedRunningTime="2026-01-22 14:06:37.590585434 +0000 UTC m=+146.292485617" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.611677 4801 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mg4vw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.612209 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" podUID="401b3157-b40f-489c-827f-d4f941e96001" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.612121 4801 patch_prober.go:28] interesting pod/downloads-7954f5f757-sx6df container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.612279 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sx6df" podUID="aa063689-071e-47b5-85d1-2bfe8d3b5fec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.612382 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" event={"ID":"3c000bea-d0aa-4b12-941a-cf3b8db64b40","Type":"ContainerStarted","Data":"65563016fff2f4b76b555995bf33d4fc52b42c8c8e92166f3c32655162a7357c"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.612413 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" event={"ID":"33406000-fb47-484f-a43f-80332a3d82b4","Type":"ContainerStarted","Data":"ec4f99270542b4a2c33352947c86a0d01c3e1693bbf173ef2c57edd0c54e542a"} Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.642715 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfjhw" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.670885 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rg8c5" podStartSLOduration=126.670865366 podStartE2EDuration="2m6.670865366s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.668159028 +0000 UTC m=+146.370059211" watchObservedRunningTime="2026-01-22 14:06:37.670865366 +0000 UTC m=+146.372765549" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 
14:06:37.687102 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.689142 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.189128709 +0000 UTC m=+146.891028892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.767342 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" podStartSLOduration=126.767326291 podStartE2EDuration="2m6.767326291s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.733777009 +0000 UTC m=+146.435677192" watchObservedRunningTime="2026-01-22 14:06:37.767326291 +0000 UTC m=+146.469226474" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.791179 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.791579 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.291567006 +0000 UTC m=+146.993467189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.819837 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wl6g5" podStartSLOduration=126.819822986 podStartE2EDuration="2m6.819822986s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.767696051 +0000 UTC m=+146.469596234" watchObservedRunningTime="2026-01-22 14:06:37.819822986 +0000 UTC m=+146.521723169" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.820460 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" podStartSLOduration=126.820443854 podStartE2EDuration="2m6.820443854s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.818848798 +0000 UTC m=+146.520748981" watchObservedRunningTime="2026-01-22 14:06:37.820443854 +0000 UTC m=+146.522344027" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.852045 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:37 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:37 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:37 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.852109 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.893061 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.893384 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.393368924 +0000 UTC m=+147.095269107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.897661 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d9f9k" podStartSLOduration=126.897641277 podStartE2EDuration="2m6.897641277s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.854730736 +0000 UTC m=+146.556630929" watchObservedRunningTime="2026-01-22 14:06:37.897641277 +0000 UTC m=+146.599541460" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.951793 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-q2f25" podStartSLOduration=126.951776888 podStartE2EDuration="2m6.951776888s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.900152248 +0000 UTC m=+146.602052431" watchObservedRunningTime="2026-01-22 14:06:37.951776888 +0000 UTC m=+146.653677071" Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.995388 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: 
\"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:37 crc kubenswrapper[4801]: E0122 14:06:37.995750 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.495735848 +0000 UTC m=+147.197636031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:37 crc kubenswrapper[4801]: I0122 14:06:37.996148 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-j5l5v" podStartSLOduration=125.99613767 podStartE2EDuration="2m5.99613767s" podCreationTimestamp="2026-01-22 14:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:37.953639502 +0000 UTC m=+146.655539685" watchObservedRunningTime="2026-01-22 14:06:37.99613767 +0000 UTC m=+146.698037853" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.037341 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nqdqv" podStartSLOduration=127.037326871 podStartE2EDuration="2m7.037326871s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-22 14:06:37.999350572 +0000 UTC m=+146.701250765" watchObservedRunningTime="2026-01-22 14:06:38.037326871 +0000 UTC m=+146.739227054" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.037823 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-thrzc" podStartSLOduration=127.037820265 podStartE2EDuration="2m7.037820265s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:38.035761186 +0000 UTC m=+146.737661369" watchObservedRunningTime="2026-01-22 14:06:38.037820265 +0000 UTC m=+146.739720448" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.081062 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" podStartSLOduration=127.081044194 podStartE2EDuration="2m7.081044194s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:38.076528524 +0000 UTC m=+146.778428727" watchObservedRunningTime="2026-01-22 14:06:38.081044194 +0000 UTC m=+146.782944377" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.096667 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.096964 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.596946049 +0000 UTC m=+147.298846242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.197935 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.198198 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.698186182 +0000 UTC m=+147.400086365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.299286 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.299442 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.799420114 +0000 UTC m=+147.501320297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.299636 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.299974 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.799961139 +0000 UTC m=+147.501861322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.400595 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.400781 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.900737598 +0000 UTC m=+147.602637781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.400884 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.401282 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:38.901271543 +0000 UTC m=+147.603171796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.505095 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.505285 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.005258244 +0000 UTC m=+147.707158417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.505364 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.505712 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.005696027 +0000 UTC m=+147.707596220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.552897 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-drqnv"] Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.553962 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.556533 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.570666 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drqnv"] Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.606391 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.606616 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.106584749 +0000 UTC m=+147.808484942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.606864 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-catalog-content\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.606953 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-utilities\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.606996 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn8tl\" (UniqueName: \"kubernetes.io/projected/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-kube-api-access-dn8tl\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.607066 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.607360 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.107351661 +0000 UTC m=+147.809251844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.617318 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" event={"ID":"eb7d48ad-03ae-4594-8c04-d9afa9dc453c","Type":"ContainerStarted","Data":"bcf9ed3fdeb850bdae38bf038b58aa9e40b3d6e5dedc3feafaa5fb757230e507"} Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.619654 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-879qk" event={"ID":"bcd466a4-1fcd-47e6-9a9c-fae349c4806c","Type":"ContainerStarted","Data":"60caaa42b748486bb2ce0ac2db8b00ea9de9512191a218287362e76133a6e80c"} Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.619708 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-879qk" event={"ID":"bcd466a4-1fcd-47e6-9a9c-fae349c4806c","Type":"ContainerStarted","Data":"5c5db05cf95896c885a24f4a8546cec62e1cef264ddbab78b12b14df76f35fb2"} Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.633701 
4801 generic.go:334] "Generic (PLEG): container finished" podID="a59a6854-54d3-4b6a-a037-eb212513fa8f" containerID="d7d279b00262a79e262a342f5d4dddeba88b911e43a2907b5b70e1048b1c5de1" exitCode=0 Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.634390 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" event={"ID":"a59a6854-54d3-4b6a-a037-eb212513fa8f","Type":"ContainerDied","Data":"d7d279b00262a79e262a342f5d4dddeba88b911e43a2907b5b70e1048b1c5de1"} Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.695172 4801 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.708954 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.709433 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-catalog-content\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.709648 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-utilities\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 
14:06:38.709711 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn8tl\" (UniqueName: \"kubernetes.io/projected/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-kube-api-access-dn8tl\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.711480 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.211459904 +0000 UTC m=+147.913360087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.712918 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-utilities\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.716308 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-catalog-content\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc 
kubenswrapper[4801]: I0122 14:06:38.763288 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn8tl\" (UniqueName: \"kubernetes.io/projected/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-kube-api-access-dn8tl\") pod \"certified-operators-drqnv\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.769381 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-87tcn"] Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.770276 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.774771 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.787714 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87tcn"] Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.814099 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.814180 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-utilities\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc 
kubenswrapper[4801]: I0122 14:06:38.814235 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxt5r\" (UniqueName: \"kubernetes.io/projected/5544e90b-4784-48bb-9462-5562be0bc923-kube-api-access-cxt5r\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.814257 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-catalog-content\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.814564 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.31455147 +0000 UTC m=+148.016451653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.846234 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:38 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:38 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:38 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.846295 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.875844 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.919109 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.919321 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxt5r\" (UniqueName: \"kubernetes.io/projected/5544e90b-4784-48bb-9462-5562be0bc923-kube-api-access-cxt5r\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.919347 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-catalog-content\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.919423 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-utilities\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.919809 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-utilities\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " 
pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: E0122 14:06:38.919881 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.419866708 +0000 UTC m=+148.121766892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.920290 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-catalog-content\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.952366 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxt5r\" (UniqueName: \"kubernetes.io/projected/5544e90b-4784-48bb-9462-5562be0bc923-kube-api-access-cxt5r\") pod \"community-operators-87tcn\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.963272 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w9tg8"] Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.964380 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:38 crc kubenswrapper[4801]: I0122 14:06:38.981644 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9tg8"] Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.020340 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.020550 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-catalog-content\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: E0122 14:06:39.020577 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.520565655 +0000 UTC m=+148.222465838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.020673 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-utilities\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.020748 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzk6v\" (UniqueName: \"kubernetes.io/projected/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-kube-api-access-zzk6v\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.107342 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.125023 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.125291 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-catalog-content\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.125347 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-utilities\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.125376 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzk6v\" (UniqueName: \"kubernetes.io/projected/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-kube-api-access-zzk6v\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: E0122 14:06:39.125697 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-22 14:06:39.625683478 +0000 UTC m=+148.327583661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.126020 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-catalog-content\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.126220 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-utilities\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.134251 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w7hkb"] Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.135165 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.150618 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7hkb"] Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.153821 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzk6v\" (UniqueName: \"kubernetes.io/projected/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-kube-api-access-zzk6v\") pod \"certified-operators-w9tg8\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") " pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.233151 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-catalog-content\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.233202 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-utilities\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.233241 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phqnx\" (UniqueName: \"kubernetes.io/projected/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-kube-api-access-phqnx\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.233313 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:39 crc kubenswrapper[4801]: E0122 14:06:39.233682 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.733665564 +0000 UTC m=+148.435565747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zckqh" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.287739 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9tg8" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.334189 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.334466 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.334537 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.334570 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-catalog-content\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.334592 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-utilities\") pod 
\"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.334631 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phqnx\" (UniqueName: \"kubernetes.io/projected/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-kube-api-access-phqnx\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.335298 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-catalog-content\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.335347 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-utilities\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: E0122 14:06:39.335415 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 14:06:39.83538999 +0000 UTC m=+148.537290173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.338399 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.339203 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.361726 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phqnx\" (UniqueName: \"kubernetes.io/projected/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-kube-api-access-phqnx\") pod \"community-operators-w7hkb\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") " pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.370816 4801 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-22T14:06:38.695202658Z","Handler":null,"Name":""} Jan 
22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.375176 4801 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.375227 4801 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.436637 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.483007 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.490266 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7hkb" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.498797 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.506301 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87tcn"] Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.549695 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.549838 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.557962 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.596716 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drqnv"] Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.649497 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9tg8"] Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 
14:06:39.692327 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87tcn" event={"ID":"5544e90b-4784-48bb-9462-5562be0bc923","Type":"ContainerStarted","Data":"eb1687fd83a584634befbf1599e688969001e3730f71dd393b2d68431c2db316"} Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.701436 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drqnv" event={"ID":"ca3c0e1d-7cb5-400b-8c40-10fe54d57224","Type":"ContainerStarted","Data":"1fd61157a6df2aebd941bfae9f23f2ce6b5fc6d789e4fabf14c2ea8222c0bf92"} Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.707338 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fpc52" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.828435 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.835922 4801 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.835986 4801 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:39 crc kubenswrapper[4801]: W0122 14:06:39.856656 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181b9b48_fc92_4b6d_b7e7_a61fb42b2024.slice/crio-e468fd37eb3ad679dddccfc42ba3d7bdb7c586df0ad3f2e2a2cc38ef92855d1e WatchSource:0}: Error finding container e468fd37eb3ad679dddccfc42ba3d7bdb7c586df0ad3f2e2a2cc38ef92855d1e: Status 404 returned error can't find the container with id e468fd37eb3ad679dddccfc42ba3d7bdb7c586df0ad3f2e2a2cc38ef92855d1e Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.863269 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:39 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:39 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:39 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.863368 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 
14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.934366 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7hkb"] Jan 22 14:06:39 crc kubenswrapper[4801]: I0122 14:06:39.990169 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zckqh\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.067442 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.090827 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.188891 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.295190 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.313300 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.345924 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.346002 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.350955 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.370408 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7prpr\" (UniqueName: \"kubernetes.io/projected/a59a6854-54d3-4b6a-a037-eb212513fa8f-kube-api-access-7prpr\") pod \"a59a6854-54d3-4b6a-a037-eb212513fa8f\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.370494 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a59a6854-54d3-4b6a-a037-eb212513fa8f-secret-volume\") pod \"a59a6854-54d3-4b6a-a037-eb212513fa8f\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.370566 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a59a6854-54d3-4b6a-a037-eb212513fa8f-config-volume\") pod \"a59a6854-54d3-4b6a-a037-eb212513fa8f\" (UID: \"a59a6854-54d3-4b6a-a037-eb212513fa8f\") " Jan 22 14:06:40 crc 
kubenswrapper[4801]: I0122 14:06:40.379550 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59a6854-54d3-4b6a-a037-eb212513fa8f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a59a6854-54d3-4b6a-a037-eb212513fa8f" (UID: "a59a6854-54d3-4b6a-a037-eb212513fa8f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.382127 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59a6854-54d3-4b6a-a037-eb212513fa8f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a59a6854-54d3-4b6a-a037-eb212513fa8f" (UID: "a59a6854-54d3-4b6a-a037-eb212513fa8f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.382370 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59a6854-54d3-4b6a-a037-eb212513fa8f-kube-api-access-7prpr" (OuterVolumeSpecName: "kube-api-access-7prpr") pod "a59a6854-54d3-4b6a-a037-eb212513fa8f" (UID: "a59a6854-54d3-4b6a-a037-eb212513fa8f"). InnerVolumeSpecName "kube-api-access-7prpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.400142 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 14:06:40 crc kubenswrapper[4801]: E0122 14:06:40.400346 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59a6854-54d3-4b6a-a037-eb212513fa8f" containerName="collect-profiles" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.400357 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59a6854-54d3-4b6a-a037-eb212513fa8f" containerName="collect-profiles" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.405121 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59a6854-54d3-4b6a-a037-eb212513fa8f" containerName="collect-profiles" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.405488 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.414176 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.414460 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.472072 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.472163 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.472281 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7prpr\" (UniqueName: \"kubernetes.io/projected/a59a6854-54d3-4b6a-a037-eb212513fa8f-kube-api-access-7prpr\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.472293 4801 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a59a6854-54d3-4b6a-a037-eb212513fa8f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.472303 4801 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a59a6854-54d3-4b6a-a037-eb212513fa8f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.504362 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.557302 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nkd9x"] Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.562744 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.565242 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.570258 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkd9x"] Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.575319 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.575423 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.575505 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.612168 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.677773 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-catalog-content\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.677848 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-utilities\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.677868 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lkw9\" (UniqueName: \"kubernetes.io/projected/134f3572-e2f4-4817-93c5-e93c68b761de-kube-api-access-2lkw9\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.708616 4801 patch_prober.go:28] interesting pod/downloads-7954f5f757-sx6df container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.708669 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sx6df" podUID="aa063689-071e-47b5-85d1-2bfe8d3b5fec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection 
refused" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.708621 4801 patch_prober.go:28] interesting pod/downloads-7954f5f757-sx6df container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.708739 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sx6df" podUID="aa063689-071e-47b5-85d1-2bfe8d3b5fec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.731081 4801 generic.go:334] "Generic (PLEG): container finished" podID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerID="d69cddb9c1d538a73b06f83799c239abe4357c9409a9e49f43a3bf98979663a5" exitCode=0 Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.732944 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7hkb" event={"ID":"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c","Type":"ContainerDied","Data":"d69cddb9c1d538a73b06f83799c239abe4357c9409a9e49f43a3bf98979663a5"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.742523 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.742575 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7hkb" event={"ID":"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c","Type":"ContainerStarted","Data":"95917f7645ca3103073b1fedc299b2beab905eddfb6926dae3668e01f4473e2f"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.764784 4801 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.769441 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.769539 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.775374 4801 patch_prober.go:28] interesting pod/console-f9d7485db-nbhfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.775442 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nbhfx" podUID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.780091 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-catalog-content\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " 
pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.780583 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-catalog-content\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.780683 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-utilities\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.781020 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkw9\" (UniqueName: \"kubernetes.io/projected/134f3572-e2f4-4817-93c5-e93c68b761de-kube-api-access-2lkw9\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.780898 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-utilities\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.786477 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8645c02dda4a5f1c5a11bd1909b3655d6d55a6bc8d311edda7af0c644fcc4df0"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 
14:06:40.786529 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c171c74c0399626479e9690653a609c4b7e3ccd92d49bdd1edbadd3841df7ea5"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.812081 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" event={"ID":"a59a6854-54d3-4b6a-a037-eb212513fa8f","Type":"ContainerDied","Data":"0e6267a69d72c59e0c5843e8b3febe54bf923d795919a8e8f42a0b6ada755167"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.812127 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e6267a69d72c59e0c5843e8b3febe54bf923d795919a8e8f42a0b6ada755167" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.812241 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484840-xzxsv" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.844599 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.866506 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lkw9\" (UniqueName: \"kubernetes.io/projected/134f3572-e2f4-4817-93c5-e93c68b761de-kube-api-access-2lkw9\") pod \"redhat-marketplace-nkd9x\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.873997 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8c91f7126a24e3e03da27d71fd5b4e721820d450a11f6b064ab45c6add09c392"} 
Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.874039 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cc7837330d89062979ffe308b255c25ac791868891818cbba3f10ac7504e6abb"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.874733 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.875663 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:40 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:40 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:40 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.875698 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.887888 4801 generic.go:334] "Generic (PLEG): container finished" podID="5544e90b-4784-48bb-9462-5562be0bc923" containerID="053a852405672299475bd86f32a7f7a203f6a48466b41949a2ceec1b20def896" exitCode=0 Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.887979 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87tcn" event={"ID":"5544e90b-4784-48bb-9462-5562be0bc923","Type":"ContainerDied","Data":"053a852405672299475bd86f32a7f7a203f6a48466b41949a2ceec1b20def896"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.890688 
4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zckqh"] Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.907803 4801 generic.go:334] "Generic (PLEG): container finished" podID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerID="6570fea44558fe8818439a4a2416e7716172535155c2e9a8025555188a889346" exitCode=0 Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.907885 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9tg8" event={"ID":"181b9b48-fc92-4b6d-b7e7-a61fb42b2024","Type":"ContainerDied","Data":"6570fea44558fe8818439a4a2416e7716172535155c2e9a8025555188a889346"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.907911 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9tg8" event={"ID":"181b9b48-fc92-4b6d-b7e7-a61fb42b2024","Type":"ContainerStarted","Data":"e468fd37eb3ad679dddccfc42ba3d7bdb7c586df0ad3f2e2a2cc38ef92855d1e"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.911482 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.933030 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" event={"ID":"eb7d48ad-03ae-4594-8c04-d9afa9dc453c","Type":"ContainerStarted","Data":"3140bd59dd5d0c16a9679ebdea4d763b8e3aea6e6ebe63b5326678590e0b09b7"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.962213 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-879qk" event={"ID":"bcd466a4-1fcd-47e6-9a9c-fae349c4806c","Type":"ContainerStarted","Data":"974072650fc3a5fcba5ced139f09a621b0df3d36dceca374f346928a60124cc1"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.981083 4801 generic.go:334] "Generic (PLEG): container finished" podID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerID="43e0a286c4ed02ffe0ea28c3f5d04e5cf6392f4755bac7623de0ea9dd999c243" exitCode=0 Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.982322 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drqnv" event={"ID":"ca3c0e1d-7cb5-400b-8c40-10fe54d57224","Type":"ContainerDied","Data":"43e0a286c4ed02ffe0ea28c3f5d04e5cf6392f4755bac7623de0ea9dd999c243"} Jan 22 14:06:40 crc kubenswrapper[4801]: I0122 14:06:40.983652 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-skqvg"] Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.011435 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jxp9w" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.011490 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skqvg"] Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.011586 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.051207 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" podStartSLOduration=130.051181465 podStartE2EDuration="2m10.051181465s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:41.009982254 +0000 UTC m=+149.711882447" watchObservedRunningTime="2026-01-22 14:06:41.051181465 +0000 UTC m=+149.753081658" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.064974 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.088360 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbghp\" (UniqueName: \"kubernetes.io/projected/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-kube-api-access-tbghp\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.088494 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-catalog-content\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.088563 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-utilities\") pod \"redhat-marketplace-skqvg\" 
(UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.104216 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-879qk" podStartSLOduration=13.104198395 podStartE2EDuration="13.104198395s" podCreationTimestamp="2026-01-22 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:41.049795035 +0000 UTC m=+149.751695218" watchObservedRunningTime="2026-01-22 14:06:41.104198395 +0000 UTC m=+149.806098578" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.192017 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-catalog-content\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.192092 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-utilities\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.192169 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbghp\" (UniqueName: \"kubernetes.io/projected/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-kube-api-access-tbghp\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.192660 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-catalog-content\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.192933 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-utilities\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.241932 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbghp\" (UniqueName: \"kubernetes.io/projected/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-kube-api-access-tbghp\") pod \"redhat-marketplace-skqvg\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") " pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.360283 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.402709 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.465591 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkd9x"] Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.603346 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.786638 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tk862"] Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.787787 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.787991 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tk862"] Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.790373 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.794340 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skqvg"] Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.813005 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-catalog-content\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.813117 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljkk2\" (UniqueName: 
\"kubernetes.io/projected/492ffad8-460b-46f1-b566-9e1cce5cbcb0-kube-api-access-ljkk2\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.813139 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-utilities\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.842369 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:41 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:41 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:41 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.842419 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.914327 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljkk2\" (UniqueName: \"kubernetes.io/projected/492ffad8-460b-46f1-b566-9e1cce5cbcb0-kube-api-access-ljkk2\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.914394 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-utilities\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.914440 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-catalog-content\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.915193 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-catalog-content\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.915760 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-utilities\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:41 crc kubenswrapper[4801]: I0122 14:06:41.970874 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljkk2\" (UniqueName: \"kubernetes.io/projected/492ffad8-460b-46f1-b566-9e1cce5cbcb0-kube-api-access-ljkk2\") pod \"redhat-operators-tk862\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.020630 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5","Type":"ContainerStarted","Data":"e86637cf6d6e73a0fe8feec27310a24b70168a1da8a765b71144c591bab14740"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.036923 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skqvg" event={"ID":"c9a37ba1-7039-4287-a2ae-3ba3d028dc96","Type":"ContainerStarted","Data":"5aa17bd75fbe663554b4c03f4708094e97e603d0d7d0b79f0ec71a54af1082eb"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.053816 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" event={"ID":"25f78c50-1bd7-4abd-b231-a932aa15f2af","Type":"ContainerStarted","Data":"9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.053859 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" event={"ID":"25f78c50-1bd7-4abd-b231-a932aa15f2af","Type":"ContainerStarted","Data":"eb5208e2098a19caa73387149dc2069eafac0ec4e5adc52c1eca25a990af40d1"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.054594 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.056290 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e4f297c080eec09c7d71bdcde98c3a2d0d28903fe96005c36ae9bca79dd3f40c"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.056360 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2c29a5442238293795cc64f2c1b9f9d6f191c46bc14c178d48e501f93316d619"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.059347 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkd9x" event={"ID":"134f3572-e2f4-4817-93c5-e93c68b761de","Type":"ContainerStarted","Data":"25beb75e155fa4d2ed2de7ce883421880a66b352ae0f7b441cae4f3052ed071b"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.059376 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkd9x" event={"ID":"134f3572-e2f4-4817-93c5-e93c68b761de","Type":"ContainerStarted","Data":"bb60979a1dff206b06e5af275da6472950331df67f6302d30f85b8fe9d38973d"} Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.119496 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" podStartSLOduration=131.119400607 podStartE2EDuration="2m11.119400607s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:42.094468692 +0000 UTC m=+150.796368885" watchObservedRunningTime="2026-01-22 14:06:42.119400607 +0000 UTC m=+150.821300810" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.136624 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gsvrn"] Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.138030 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.139907 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.170803 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gsvrn"] Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.328234 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-utilities\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.328662 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lvkh\" (UniqueName: \"kubernetes.io/projected/8fad3b3a-1004-410f-9058-bddf49f4be78-kube-api-access-6lvkh\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.328692 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-catalog-content\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.423432 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tk862"] Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.429513 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lvkh\" (UniqueName: \"kubernetes.io/projected/8fad3b3a-1004-410f-9058-bddf49f4be78-kube-api-access-6lvkh\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " 
pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.429557 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-catalog-content\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.429647 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-utilities\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.430270 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-utilities\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.431141 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-catalog-content\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.432552 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.432643 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 
14:06:42.439617 4801 patch_prober.go:28] interesting pod/apiserver-76f77b778f-vcjmg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]log ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]etcd ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/generic-apiserver-start-informers ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/max-in-flight-filter ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 22 14:06:42 crc kubenswrapper[4801]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 22 14:06:42 crc kubenswrapper[4801]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/project.openshift.io-projectcache ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/openshift.io-startinformers ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 22 14:06:42 crc kubenswrapper[4801]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 22 14:06:42 crc kubenswrapper[4801]: livez check failed Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.439671 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" podUID="eb7d48ad-03ae-4594-8c04-d9afa9dc453c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 
14:06:42.451547 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lvkh\" (UniqueName: \"kubernetes.io/projected/8fad3b3a-1004-410f-9058-bddf49f4be78-kube-api-access-6lvkh\") pod \"redhat-operators-gsvrn\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") " pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.453754 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsvrn" Jan 22 14:06:42 crc kubenswrapper[4801]: W0122 14:06:42.479591 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod492ffad8_460b_46f1_b566_9e1cce5cbcb0.slice/crio-53dac857eaf59bdcd489024087cbebb9141b1f114457eb8a9b83b8c39a8e43d0 WatchSource:0}: Error finding container 53dac857eaf59bdcd489024087cbebb9141b1f114457eb8a9b83b8c39a8e43d0: Status 404 returned error can't find the container with id 53dac857eaf59bdcd489024087cbebb9141b1f114457eb8a9b83b8c39a8e43d0 Jan 22 14:06:42 crc kubenswrapper[4801]: E0122 14:06:42.580025 4801 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod41a8ebc4_e9ab_4a0f_9d53_2e1d7c23d7d5.slice/crio-conmon-f35054a073d04197cdda84791067cab6d9e7d3d7acbf93efa53b9e02227167fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod41a8ebc4_e9ab_4a0f_9d53_2e1d7c23d7d5.slice/crio-f35054a073d04197cdda84791067cab6d9e7d3d7acbf93efa53b9e02227167fc.scope\": RecentStats: unable to find data in memory cache]" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.844850 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:42 crc 
kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:42 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:42 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.845312 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.885278 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.886148 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.889546 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.889443 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 22 14:06:42 crc kubenswrapper[4801]: I0122 14:06:42.892668 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.011191 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gsvrn"] Jan 22 14:06:43 crc kubenswrapper[4801]: W0122 14:06:43.018422 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fad3b3a_1004_410f_9058_bddf49f4be78.slice/crio-b25e49bb3da4b5227601a3d7a8c796d547a6ee99e8237bc485366b904f0d9e56 WatchSource:0}: Error finding container b25e49bb3da4b5227601a3d7a8c796d547a6ee99e8237bc485366b904f0d9e56: Status 404 
returned error can't find the container with id b25e49bb3da4b5227601a3d7a8c796d547a6ee99e8237bc485366b904f0d9e56 Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.039492 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c8f17bb-aac8-40a0-bedb-6cda58165103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.039547 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8f17bb-aac8-40a0-bedb-6cda58165103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.067675 4801 generic.go:334] "Generic (PLEG): container finished" podID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerID="7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016" exitCode=0 Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.067797 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skqvg" event={"ID":"c9a37ba1-7039-4287-a2ae-3ba3d028dc96","Type":"ContainerDied","Data":"7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016"} Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.070234 4801 generic.go:334] "Generic (PLEG): container finished" podID="134f3572-e2f4-4817-93c5-e93c68b761de" containerID="25beb75e155fa4d2ed2de7ce883421880a66b352ae0f7b441cae4f3052ed071b" exitCode=0 Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.070275 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkd9x" 
event={"ID":"134f3572-e2f4-4817-93c5-e93c68b761de","Type":"ContainerDied","Data":"25beb75e155fa4d2ed2de7ce883421880a66b352ae0f7b441cae4f3052ed071b"} Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.077332 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsvrn" event={"ID":"8fad3b3a-1004-410f-9058-bddf49f4be78","Type":"ContainerStarted","Data":"b25e49bb3da4b5227601a3d7a8c796d547a6ee99e8237bc485366b904f0d9e56"} Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.079920 4801 generic.go:334] "Generic (PLEG): container finished" podID="41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5" containerID="f35054a073d04197cdda84791067cab6d9e7d3d7acbf93efa53b9e02227167fc" exitCode=0 Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.080045 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5","Type":"ContainerDied","Data":"f35054a073d04197cdda84791067cab6d9e7d3d7acbf93efa53b9e02227167fc"} Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.086241 4801 generic.go:334] "Generic (PLEG): container finished" podID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerID="0aa0d456c3eca4be34a250b5b30a17825ea50f5a2c87a8a3741bcf7c837a5ae4" exitCode=0 Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.086649 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk862" event={"ID":"492ffad8-460b-46f1-b566-9e1cce5cbcb0","Type":"ContainerDied","Data":"0aa0d456c3eca4be34a250b5b30a17825ea50f5a2c87a8a3741bcf7c837a5ae4"} Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.089568 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk862" event={"ID":"492ffad8-460b-46f1-b566-9e1cce5cbcb0","Type":"ContainerStarted","Data":"53dac857eaf59bdcd489024087cbebb9141b1f114457eb8a9b83b8c39a8e43d0"} Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 
14:06:43.144075 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c8f17bb-aac8-40a0-bedb-6cda58165103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.144142 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8f17bb-aac8-40a0-bedb-6cda58165103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.145817 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8f17bb-aac8-40a0-bedb-6cda58165103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.166625 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c8f17bb-aac8-40a0-bedb-6cda58165103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.255867 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.785142 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.843922 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:43 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:43 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:43 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:43 crc kubenswrapper[4801]: I0122 14:06:43.843983 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.102764 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c8f17bb-aac8-40a0-bedb-6cda58165103","Type":"ContainerStarted","Data":"6b3737262f8593b0017b1c114990283e17f911a90f215f4f7eda361b1184e141"} Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.104416 4801 generic.go:334] "Generic (PLEG): container finished" podID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerID="883a3bf9cb6cf6f62b1ac8de82d3f965aeb10c61f523ee66899cc19f1e050f62" exitCode=0 Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.105405 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsvrn" event={"ID":"8fad3b3a-1004-410f-9058-bddf49f4be78","Type":"ContainerDied","Data":"883a3bf9cb6cf6f62b1ac8de82d3f965aeb10c61f523ee66899cc19f1e050f62"} Jan 22 14:06:44 crc 
kubenswrapper[4801]: I0122 14:06:44.449337 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.590264 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kube-api-access\") pod \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.590389 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kubelet-dir\") pod \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\" (UID: \"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5\") " Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.590497 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5" (UID: "41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.591146 4801 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.598148 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5" (UID: "41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.694307 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.842157 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:44 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:44 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:44 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:44 crc kubenswrapper[4801]: I0122 14:06:44.842244 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:45 crc kubenswrapper[4801]: I0122 14:06:45.127497 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 14:06:45 crc kubenswrapper[4801]: I0122 14:06:45.127482 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5","Type":"ContainerDied","Data":"e86637cf6d6e73a0fe8feec27310a24b70168a1da8a765b71144c591bab14740"} Jan 22 14:06:45 crc kubenswrapper[4801]: I0122 14:06:45.127621 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e86637cf6d6e73a0fe8feec27310a24b70168a1da8a765b71144c591bab14740" Jan 22 14:06:45 crc kubenswrapper[4801]: I0122 14:06:45.129780 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c8f17bb-aac8-40a0-bedb-6cda58165103","Type":"ContainerStarted","Data":"967e08a380256c0f1ff1fb7a15594710f4327ed396564dd124704121207c1682"} Jan 22 14:06:45 crc kubenswrapper[4801]: I0122 14:06:45.841601 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:45 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:45 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:45 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:45 crc kubenswrapper[4801]: I0122 14:06:45.841670 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:46 crc kubenswrapper[4801]: I0122 14:06:46.069145 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gw2wt" Jan 22 14:06:46 crc 
kubenswrapper[4801]: I0122 14:06:46.161632 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.161615719 podStartE2EDuration="4.161615719s" podCreationTimestamp="2026-01-22 14:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:06:46.150803079 +0000 UTC m=+154.852703262" watchObservedRunningTime="2026-01-22 14:06:46.161615719 +0000 UTC m=+154.863515902" Jan 22 14:06:46 crc kubenswrapper[4801]: I0122 14:06:46.841148 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:46 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:46 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:46 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:46 crc kubenswrapper[4801]: I0122 14:06:46.841209 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:47 crc kubenswrapper[4801]: I0122 14:06:47.157497 4801 generic.go:334] "Generic (PLEG): container finished" podID="1c8f17bb-aac8-40a0-bedb-6cda58165103" containerID="967e08a380256c0f1ff1fb7a15594710f4327ed396564dd124704121207c1682" exitCode=0 Jan 22 14:06:47 crc kubenswrapper[4801]: I0122 14:06:47.157540 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c8f17bb-aac8-40a0-bedb-6cda58165103","Type":"ContainerDied","Data":"967e08a380256c0f1ff1fb7a15594710f4327ed396564dd124704121207c1682"} Jan 22 14:06:47 crc 
kubenswrapper[4801]: I0122 14:06:47.439732 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:47 crc kubenswrapper[4801]: I0122 14:06:47.444586 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vcjmg" Jan 22 14:06:47 crc kubenswrapper[4801]: I0122 14:06:47.846549 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:47 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:47 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:47 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:47 crc kubenswrapper[4801]: I0122 14:06:47.846673 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.394622 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.572263 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c8f17bb-aac8-40a0-bedb-6cda58165103-kube-api-access\") pod \"1c8f17bb-aac8-40a0-bedb-6cda58165103\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.572338 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8f17bb-aac8-40a0-bedb-6cda58165103-kubelet-dir\") pod \"1c8f17bb-aac8-40a0-bedb-6cda58165103\" (UID: \"1c8f17bb-aac8-40a0-bedb-6cda58165103\") " Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.572602 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c8f17bb-aac8-40a0-bedb-6cda58165103-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c8f17bb-aac8-40a0-bedb-6cda58165103" (UID: "1c8f17bb-aac8-40a0-bedb-6cda58165103"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.587388 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8f17bb-aac8-40a0-bedb-6cda58165103-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c8f17bb-aac8-40a0-bedb-6cda58165103" (UID: "1c8f17bb-aac8-40a0-bedb-6cda58165103"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.674127 4801 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c8f17bb-aac8-40a0-bedb-6cda58165103-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.674193 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c8f17bb-aac8-40a0-bedb-6cda58165103-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.841843 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:48 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:48 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:48 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:48 crc kubenswrapper[4801]: I0122 14:06:48.841895 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:49 crc kubenswrapper[4801]: I0122 14:06:49.175806 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c8f17bb-aac8-40a0-bedb-6cda58165103","Type":"ContainerDied","Data":"6b3737262f8593b0017b1c114990283e17f911a90f215f4f7eda361b1184e141"} Jan 22 14:06:49 crc kubenswrapper[4801]: I0122 14:06:49.176169 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b3737262f8593b0017b1c114990283e17f911a90f215f4f7eda361b1184e141" Jan 22 14:06:49 crc kubenswrapper[4801]: 
I0122 14:06:49.176264 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 14:06:49 crc kubenswrapper[4801]: I0122 14:06:49.842046 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:49 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:49 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:49 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:49 crc kubenswrapper[4801]: I0122 14:06:49.842106 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:50 crc kubenswrapper[4801]: I0122 14:06:50.714164 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-sx6df" Jan 22 14:06:50 crc kubenswrapper[4801]: I0122 14:06:50.768348 4801 patch_prober.go:28] interesting pod/console-f9d7485db-nbhfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 22 14:06:50 crc kubenswrapper[4801]: I0122 14:06:50.768399 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nbhfx" podUID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 22 14:06:50 crc kubenswrapper[4801]: I0122 14:06:50.845651 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:50 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:50 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:50 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:50 crc kubenswrapper[4801]: I0122 14:06:50.845722 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:51 crc kubenswrapper[4801]: I0122 14:06:51.841480 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:51 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:51 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:51 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:51 crc kubenswrapper[4801]: I0122 14:06:51.841539 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:52 crc kubenswrapper[4801]: I0122 14:06:52.848034 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:52 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:52 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:52 crc 
kubenswrapper[4801]: healthz check failed Jan 22 14:06:52 crc kubenswrapper[4801]: I0122 14:06:52.848339 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:53 crc kubenswrapper[4801]: I0122 14:06:53.793247 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:53 crc kubenswrapper[4801]: I0122 14:06:53.818472 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18bcc554-da90-40e9-b32f-e0d5d0936faa-metrics-certs\") pod \"network-metrics-daemon-ph2s5\" (UID: \"18bcc554-da90-40e9-b32f-e0d5d0936faa\") " pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:53 crc kubenswrapper[4801]: I0122 14:06:53.851423 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:53 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:53 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:53 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:53 crc kubenswrapper[4801]: I0122 14:06:53.851503 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:54 crc kubenswrapper[4801]: 
I0122 14:06:54.085755 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ph2s5" Jan 22 14:06:54 crc kubenswrapper[4801]: I0122 14:06:54.842967 4801 patch_prober.go:28] interesting pod/router-default-5444994796-57pdz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 14:06:54 crc kubenswrapper[4801]: [-]has-synced failed: reason withheld Jan 22 14:06:54 crc kubenswrapper[4801]: [+]process-running ok Jan 22 14:06:54 crc kubenswrapper[4801]: healthz check failed Jan 22 14:06:54 crc kubenswrapper[4801]: I0122 14:06:54.843019 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-57pdz" podUID="9f30eedd-eee4-43f7-a1dd-4614580c7f0a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 14:06:55 crc kubenswrapper[4801]: I0122 14:06:55.850009 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:06:55 crc kubenswrapper[4801]: I0122 14:06:55.854235 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-57pdz" Jan 22 14:07:00 crc kubenswrapper[4801]: I0122 14:07:00.299374 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:07:00 crc kubenswrapper[4801]: I0122 14:07:00.774922 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:07:00 crc kubenswrapper[4801]: I0122 14:07:00.779720 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:07:04 crc kubenswrapper[4801]: I0122 14:07:04.021516 
4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:07:04 crc kubenswrapper[4801]: I0122 14:07:04.021856 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:07:10 crc kubenswrapper[4801]: I0122 14:07:10.986958 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5xtp" Jan 22 14:07:17 crc kubenswrapper[4801]: E0122 14:07:17.297408 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 22 14:07:17 crc kubenswrapper[4801]: E0122 14:07:17.298158 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lvkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gsvrn_openshift-marketplace(8fad3b3a-1004-410f-9058-bddf49f4be78): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:17 crc kubenswrapper[4801]: E0122 14:07:17.300420 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gsvrn" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" Jan 22 14:07:18 crc 
kubenswrapper[4801]: E0122 14:07:18.596384 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gsvrn" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" Jan 22 14:07:18 crc kubenswrapper[4801]: E0122 14:07:18.661207 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 22 14:07:18 crc kubenswrapper[4801]: E0122 14:07:18.661384 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzk6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w9tg8_openshift-marketplace(181b9b48-fc92-4b6d-b7e7-a61fb42b2024): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:18 crc kubenswrapper[4801]: E0122 14:07:18.662568 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w9tg8" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" Jan 22 14:07:18 crc 
kubenswrapper[4801]: I0122 14:07:18.879147 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 14:07:18 crc kubenswrapper[4801]: E0122 14:07:18.879404 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5" containerName="pruner" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.879420 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5" containerName="pruner" Jan 22 14:07:18 crc kubenswrapper[4801]: E0122 14:07:18.879438 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8f17bb-aac8-40a0-bedb-6cda58165103" containerName="pruner" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.879461 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8f17bb-aac8-40a0-bedb-6cda58165103" containerName="pruner" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.879584 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a8ebc4-e9ab-4a0f-9d53-2e1d7c23d7d5" containerName="pruner" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.879602 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8f17bb-aac8-40a0-bedb-6cda58165103" containerName="pruner" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.880004 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.881678 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.881752 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.892744 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.975176 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c2a742-83c5-44eb-9470-7b802883f5b6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:18 crc kubenswrapper[4801]: I0122 14:07:18.975345 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0c2a742-83c5-44eb-9470-7b802883f5b6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:19 crc kubenswrapper[4801]: I0122 14:07:19.076200 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c2a742-83c5-44eb-9470-7b802883f5b6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:19 crc kubenswrapper[4801]: I0122 14:07:19.076255 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c0c2a742-83c5-44eb-9470-7b802883f5b6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:19 crc kubenswrapper[4801]: I0122 14:07:19.076316 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0c2a742-83c5-44eb-9470-7b802883f5b6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:19 crc kubenswrapper[4801]: I0122 14:07:19.100279 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c2a742-83c5-44eb-9470-7b802883f5b6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:19 crc kubenswrapper[4801]: I0122 14:07:19.220799 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:19 crc kubenswrapper[4801]: I0122 14:07:19.503891 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 14:07:19 crc kubenswrapper[4801]: E0122 14:07:19.888804 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w9tg8" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" Jan 22 14:07:19 crc kubenswrapper[4801]: E0122 14:07:19.953055 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 22 14:07:19 crc kubenswrapper[4801]: E0122 14:07:19.953203 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tbghp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-skqvg_openshift-marketplace(c9a37ba1-7039-4287-a2ae-3ba3d028dc96): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:19 crc kubenswrapper[4801]: E0122 14:07:19.954382 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-skqvg" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" Jan 22 14:07:21 crc 
kubenswrapper[4801]: E0122 14:07:21.135133 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-skqvg" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.210410 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.210781 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-phqnx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-w7hkb_openshift-marketplace(cf2e297a-8bf5-447a-97e5-2d7cff43ed0c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.216324 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-w7hkb" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" Jan 22 14:07:21 crc 
kubenswrapper[4801]: E0122 14:07:21.234866 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.235028 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2lkw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-nkd9x_openshift-marketplace(134f3572-e2f4-4817-93c5-e93c68b761de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.236281 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nkd9x" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.242699 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.242834 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dn8tl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-drqnv_openshift-marketplace(ca3c0e1d-7cb5-400b-8c40-10fe54d57224): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.244202 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-drqnv" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" Jan 22 14:07:21 crc 
kubenswrapper[4801]: E0122 14:07:21.247130 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.247269 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxt5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-87tcn_openshift-marketplace(5544e90b-4784-48bb-9462-5562be0bc923): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.249036 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-87tcn" podUID="5544e90b-4784-48bb-9462-5562be0bc923" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.271176 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.271411 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ljkk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tk862_openshift-marketplace(492ffad8-460b-46f1-b566-9e1cce5cbcb0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.272870 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tk862" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" Jan 22 14:07:21 crc 
kubenswrapper[4801]: E0122 14:07:21.420826 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tk862" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.420887 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-drqnv" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.423694 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-87tcn" podUID="5544e90b-4784-48bb-9462-5562be0bc923" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.423766 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nkd9x" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" Jan 22 14:07:21 crc kubenswrapper[4801]: E0122 14:07:21.423811 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-w7hkb" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" Jan 22 14:07:21 crc kubenswrapper[4801]: I0122 
14:07:21.565076 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 14:07:21 crc kubenswrapper[4801]: I0122 14:07:21.590327 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ph2s5"] Jan 22 14:07:22 crc kubenswrapper[4801]: I0122 14:07:22.416268 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" event={"ID":"18bcc554-da90-40e9-b32f-e0d5d0936faa","Type":"ContainerStarted","Data":"398c15b2a0451f7587c14cd7fb1c2183df20cc2b4073aa054fc6a110ea7fbfbe"} Jan 22 14:07:22 crc kubenswrapper[4801]: I0122 14:07:22.416957 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" event={"ID":"18bcc554-da90-40e9-b32f-e0d5d0936faa","Type":"ContainerStarted","Data":"fd1b6322e623867bb50bf1b1b9d27850b01647f39af8b648e53e993e305a280d"} Jan 22 14:07:22 crc kubenswrapper[4801]: I0122 14:07:22.416976 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ph2s5" event={"ID":"18bcc554-da90-40e9-b32f-e0d5d0936faa","Type":"ContainerStarted","Data":"3ac2b5136657a0795327cea76c7234d96f85bb0d885b7e917ac6e6b95a03fffa"} Jan 22 14:07:22 crc kubenswrapper[4801]: I0122 14:07:22.420097 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c0c2a742-83c5-44eb-9470-7b802883f5b6","Type":"ContainerStarted","Data":"edec570b517a8bd44f6d9826300a0df4a10ed973ebf9f52ebe2f6b6eae671b35"} Jan 22 14:07:22 crc kubenswrapper[4801]: I0122 14:07:22.420161 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c0c2a742-83c5-44eb-9470-7b802883f5b6","Type":"ContainerStarted","Data":"e18a6cc05e274b5e7b66df6e733078b3d75087aa15a4cb7083ce611629cd8655"} Jan 22 14:07:22 crc kubenswrapper[4801]: I0122 14:07:22.435291 4801 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ph2s5" podStartSLOduration=171.435268387 podStartE2EDuration="2m51.435268387s" podCreationTimestamp="2026-01-22 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:22.430919012 +0000 UTC m=+191.132819215" watchObservedRunningTime="2026-01-22 14:07:22.435268387 +0000 UTC m=+191.137168580" Jan 22 14:07:22 crc kubenswrapper[4801]: I0122 14:07:22.451380 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.451355548 podStartE2EDuration="4.451355548s" podCreationTimestamp="2026-01-22 14:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:22.445966394 +0000 UTC m=+191.147866587" watchObservedRunningTime="2026-01-22 14:07:22.451355548 +0000 UTC m=+191.153255731" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.428310 4801 generic.go:334] "Generic (PLEG): container finished" podID="c0c2a742-83c5-44eb-9470-7b802883f5b6" containerID="edec570b517a8bd44f6d9826300a0df4a10ed973ebf9f52ebe2f6b6eae671b35" exitCode=0 Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.428396 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c0c2a742-83c5-44eb-9470-7b802883f5b6","Type":"ContainerDied","Data":"edec570b517a8bd44f6d9826300a0df4a10ed973ebf9f52ebe2f6b6eae671b35"} Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.684369 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.685394 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.687682 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.735111 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbee368a-1c06-459f-b5dc-44269645b020-kube-api-access\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.735187 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-var-lock\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.735228 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.836593 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbee368a-1c06-459f-b5dc-44269645b020-kube-api-access\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.836692 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-var-lock\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.836754 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.836933 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.837480 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-var-lock\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:23 crc kubenswrapper[4801]: I0122 14:07:23.863623 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbee368a-1c06-459f-b5dc-44269645b020-kube-api-access\") pod \"installer-9-crc\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.019249 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.414063 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.434949 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbee368a-1c06-459f-b5dc-44269645b020","Type":"ContainerStarted","Data":"50ec26ecdbd1bb3ab3c238f81bfec031547a5ddcc4de12cab9b20e9ea48a1681"} Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.709939 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.848055 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0c2a742-83c5-44eb-9470-7b802883f5b6-kubelet-dir\") pod \"c0c2a742-83c5-44eb-9470-7b802883f5b6\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.848145 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c2a742-83c5-44eb-9470-7b802883f5b6-kube-api-access\") pod \"c0c2a742-83c5-44eb-9470-7b802883f5b6\" (UID: \"c0c2a742-83c5-44eb-9470-7b802883f5b6\") " Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.849038 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0c2a742-83c5-44eb-9470-7b802883f5b6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c0c2a742-83c5-44eb-9470-7b802883f5b6" (UID: "c0c2a742-83c5-44eb-9470-7b802883f5b6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.857269 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c2a742-83c5-44eb-9470-7b802883f5b6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c0c2a742-83c5-44eb-9470-7b802883f5b6" (UID: "c0c2a742-83c5-44eb-9470-7b802883f5b6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.950040 4801 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0c2a742-83c5-44eb-9470-7b802883f5b6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:24 crc kubenswrapper[4801]: I0122 14:07:24.950104 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c2a742-83c5-44eb-9470-7b802883f5b6-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 14:07:25 crc kubenswrapper[4801]: I0122 14:07:25.440353 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 14:07:25 crc kubenswrapper[4801]: I0122 14:07:25.440351 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c0c2a742-83c5-44eb-9470-7b802883f5b6","Type":"ContainerDied","Data":"e18a6cc05e274b5e7b66df6e733078b3d75087aa15a4cb7083ce611629cd8655"} Jan 22 14:07:25 crc kubenswrapper[4801]: I0122 14:07:25.440408 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18a6cc05e274b5e7b66df6e733078b3d75087aa15a4cb7083ce611629cd8655" Jan 22 14:07:25 crc kubenswrapper[4801]: I0122 14:07:25.441772 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbee368a-1c06-459f-b5dc-44269645b020","Type":"ContainerStarted","Data":"ae96634dab70ce7eb95676e01d086683b3dd55811bdf1d0b992213db7b80b357"} Jan 22 14:07:31 crc kubenswrapper[4801]: I0122 14:07:31.596651 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.596634981 podStartE2EDuration="8.596634981s" podCreationTimestamp="2026-01-22 14:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:07:25.45988313 +0000 UTC m=+194.161783313" watchObservedRunningTime="2026-01-22 14:07:31.596634981 +0000 UTC m=+200.298535164" Jan 22 14:07:32 crc kubenswrapper[4801]: I0122 14:07:32.477312 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9tg8" event={"ID":"181b9b48-fc92-4b6d-b7e7-a61fb42b2024","Type":"ContainerStarted","Data":"3fa0055d99c58d1206f40646d5bbf9a36dbc4cf51d3807a7c48867b98571022d"} Jan 22 14:07:33 crc kubenswrapper[4801]: I0122 14:07:33.485119 4801 generic.go:334] "Generic (PLEG): container finished" podID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" 
containerID="3fa0055d99c58d1206f40646d5bbf9a36dbc4cf51d3807a7c48867b98571022d" exitCode=0 Jan 22 14:07:33 crc kubenswrapper[4801]: I0122 14:07:33.485217 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9tg8" event={"ID":"181b9b48-fc92-4b6d-b7e7-a61fb42b2024","Type":"ContainerDied","Data":"3fa0055d99c58d1206f40646d5bbf9a36dbc4cf51d3807a7c48867b98571022d"} Jan 22 14:07:33 crc kubenswrapper[4801]: I0122 14:07:33.488585 4801 generic.go:334] "Generic (PLEG): container finished" podID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerID="f386d36e5df7c8d8c33b86a456865ce8e7a9be78a58a988694b4291740d07b27" exitCode=0 Jan 22 14:07:33 crc kubenswrapper[4801]: I0122 14:07:33.488615 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7hkb" event={"ID":"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c","Type":"ContainerDied","Data":"f386d36e5df7c8d8c33b86a456865ce8e7a9be78a58a988694b4291740d07b27"} Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.020517 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.020805 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.020843 4801 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 
14:07:34.021201 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.021286 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82" gracePeriod=600 Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.504648 4801 generic.go:334] "Generic (PLEG): container finished" podID="134f3572-e2f4-4817-93c5-e93c68b761de" containerID="7762490a9c1e6a87b3617bd1e3025e5db0da2c8b193431d867128fdb599eebfb" exitCode=0 Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.504708 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkd9x" event={"ID":"134f3572-e2f4-4817-93c5-e93c68b761de","Type":"ContainerDied","Data":"7762490a9c1e6a87b3617bd1e3025e5db0da2c8b193431d867128fdb599eebfb"} Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.514716 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7hkb" event={"ID":"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c","Type":"ContainerStarted","Data":"a3d0bb3cbc46eee6708a1f1f9cfc8ecf69d8361ea81f56c81420659d0575f607"} Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.521441 4801 generic.go:334] "Generic (PLEG): container finished" podID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerID="cf6ad6da154a5a2bf22ccb93f68b1eaa98ae63d01a4906380984c35662cabac2" exitCode=0 Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.521559 
4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drqnv" event={"ID":"ca3c0e1d-7cb5-400b-8c40-10fe54d57224","Type":"ContainerDied","Data":"cf6ad6da154a5a2bf22ccb93f68b1eaa98ae63d01a4906380984c35662cabac2"} Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.524971 4801 generic.go:334] "Generic (PLEG): container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82" exitCode=0 Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.525018 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82"} Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.525041 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"05f9bc84ca95ce886a74f9a3709eccbccd35ca90eb20e5f1f29ed5cdd447357e"} Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.530315 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9tg8" event={"ID":"181b9b48-fc92-4b6d-b7e7-a61fb42b2024","Type":"ContainerStarted","Data":"ca3815bbeef0e49ed9075e63cc27e6088effe75674e2af44b6d6307b03120868"} Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.573910 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w7hkb" podStartSLOduration=2.392830891 podStartE2EDuration="55.573894423s" podCreationTimestamp="2026-01-22 14:06:39 +0000 UTC" firstStartedPulling="2026-01-22 14:06:40.743826275 +0000 UTC m=+149.445726458" lastFinishedPulling="2026-01-22 14:07:33.924889807 +0000 UTC m=+202.626789990" 
observedRunningTime="2026-01-22 14:07:34.571560076 +0000 UTC m=+203.273460269" watchObservedRunningTime="2026-01-22 14:07:34.573894423 +0000 UTC m=+203.275794606"
Jan 22 14:07:34 crc kubenswrapper[4801]: I0122 14:07:34.648321 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w9tg8" podStartSLOduration=3.547673368 podStartE2EDuration="56.648305282s" podCreationTimestamp="2026-01-22 14:06:38 +0000 UTC" firstStartedPulling="2026-01-22 14:06:40.912880061 +0000 UTC m=+149.614780244" lastFinishedPulling="2026-01-22 14:07:34.013511985 +0000 UTC m=+202.715412158" observedRunningTime="2026-01-22 14:07:34.643169535 +0000 UTC m=+203.345069748" watchObservedRunningTime="2026-01-22 14:07:34.648305282 +0000 UTC m=+203.350205465"
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.541419 4801 generic.go:334] "Generic (PLEG): container finished" podID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerID="02a0eb926a948c836aeed2c10d2934dd12c763e3f30621872ebe328e56a0c8e1" exitCode=0
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.541496 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsvrn" event={"ID":"8fad3b3a-1004-410f-9058-bddf49f4be78","Type":"ContainerDied","Data":"02a0eb926a948c836aeed2c10d2934dd12c763e3f30621872ebe328e56a0c8e1"}
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.544533 4801 generic.go:334] "Generic (PLEG): container finished" podID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerID="ba11f4a6ec85ace6916a697aaa6b17f56031308791adc418ae0cd5ccfb1b62db" exitCode=0
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.544604 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk862" event={"ID":"492ffad8-460b-46f1-b566-9e1cce5cbcb0","Type":"ContainerDied","Data":"ba11f4a6ec85ace6916a697aaa6b17f56031308791adc418ae0cd5ccfb1b62db"}
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.554626 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkd9x" event={"ID":"134f3572-e2f4-4817-93c5-e93c68b761de","Type":"ContainerStarted","Data":"6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928"}
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.567907 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drqnv" event={"ID":"ca3c0e1d-7cb5-400b-8c40-10fe54d57224","Type":"ContainerStarted","Data":"c87a2ab20204af16a6d81b2c5214858e5f3286e56ddf9ade8fe4a37d6cf0c86b"}
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.600594 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nkd9x" podStartSLOduration=2.649977983 podStartE2EDuration="55.600577216s" podCreationTimestamp="2026-01-22 14:06:40 +0000 UTC" firstStartedPulling="2026-01-22 14:06:42.06091954 +0000 UTC m=+150.762819723" lastFinishedPulling="2026-01-22 14:07:35.011518773 +0000 UTC m=+203.713418956" observedRunningTime="2026-01-22 14:07:35.592852484 +0000 UTC m=+204.294752667" watchObservedRunningTime="2026-01-22 14:07:35.600577216 +0000 UTC m=+204.302477399"
Jan 22 14:07:35 crc kubenswrapper[4801]: I0122 14:07:35.637464 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-drqnv" podStartSLOduration=3.45271095 podStartE2EDuration="57.637429885s" podCreationTimestamp="2026-01-22 14:06:38 +0000 UTC" firstStartedPulling="2026-01-22 14:06:41.009051437 +0000 UTC m=+149.710951620" lastFinishedPulling="2026-01-22 14:07:35.193770372 +0000 UTC m=+203.895670555" observedRunningTime="2026-01-22 14:07:35.635638314 +0000 UTC m=+204.337538507" watchObservedRunningTime="2026-01-22 14:07:35.637429885 +0000 UTC m=+204.339330068"
Jan 22 14:07:36 crc kubenswrapper[4801]: I0122 14:07:36.578903 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsvrn" event={"ID":"8fad3b3a-1004-410f-9058-bddf49f4be78","Type":"ContainerStarted","Data":"402c62de30af11b3ae2d6c8231dc2b0ef7ad82caba57dfa5a3853eb222ce88fe"}
Jan 22 14:07:36 crc kubenswrapper[4801]: I0122 14:07:36.581080 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk862" event={"ID":"492ffad8-460b-46f1-b566-9e1cce5cbcb0","Type":"ContainerStarted","Data":"fc56c87d6f7ca7e129f80456c71d84b31ae945343cbdb23d79a284b87d31c230"}
Jan 22 14:07:36 crc kubenswrapper[4801]: I0122 14:07:36.583059 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87tcn" event={"ID":"5544e90b-4784-48bb-9462-5562be0bc923","Type":"ContainerStarted","Data":"e60b5c8afcae40170411fcae6badef3b2d427d549604433a2e0eac6787d33963"}
Jan 22 14:07:36 crc kubenswrapper[4801]: I0122 14:07:36.627934 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tk862" podStartSLOduration=2.800517272 podStartE2EDuration="55.627914366s" podCreationTimestamp="2026-01-22 14:06:41 +0000 UTC" firstStartedPulling="2026-01-22 14:06:43.092643225 +0000 UTC m=+151.794543408" lastFinishedPulling="2026-01-22 14:07:35.920040319 +0000 UTC m=+204.621940502" observedRunningTime="2026-01-22 14:07:36.620644897 +0000 UTC m=+205.322545080" watchObservedRunningTime="2026-01-22 14:07:36.627914366 +0000 UTC m=+205.329814549"
Jan 22 14:07:36 crc kubenswrapper[4801]: I0122 14:07:36.658339 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gsvrn" podStartSLOduration=2.81557803 podStartE2EDuration="54.65831926s" podCreationTimestamp="2026-01-22 14:06:42 +0000 UTC" firstStartedPulling="2026-01-22 14:06:44.107771594 +0000 UTC m=+152.809671777" lastFinishedPulling="2026-01-22 14:07:35.950512824 +0000 UTC m=+204.652413007" observedRunningTime="2026-01-22 14:07:36.656683843 +0000 UTC m=+205.358584036" watchObservedRunningTime="2026-01-22 14:07:36.65831926 +0000 UTC m=+205.360219443"
Jan 22 14:07:37 crc kubenswrapper[4801]: I0122 14:07:37.590390 4801 generic.go:334] "Generic (PLEG): container finished" podID="5544e90b-4784-48bb-9462-5562be0bc923" containerID="e60b5c8afcae40170411fcae6badef3b2d427d549604433a2e0eac6787d33963" exitCode=0
Jan 22 14:07:37 crc kubenswrapper[4801]: I0122 14:07:37.590538 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87tcn" event={"ID":"5544e90b-4784-48bb-9462-5562be0bc923","Type":"ContainerDied","Data":"e60b5c8afcae40170411fcae6badef3b2d427d549604433a2e0eac6787d33963"}
Jan 22 14:07:37 crc kubenswrapper[4801]: I0122 14:07:37.595798 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skqvg" event={"ID":"c9a37ba1-7039-4287-a2ae-3ba3d028dc96","Type":"ContainerStarted","Data":"1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4"}
Jan 22 14:07:38 crc kubenswrapper[4801]: I0122 14:07:38.603330 4801 generic.go:334] "Generic (PLEG): container finished" podID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerID="1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4" exitCode=0
Jan 22 14:07:38 crc kubenswrapper[4801]: I0122 14:07:38.603385 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skqvg" event={"ID":"c9a37ba1-7039-4287-a2ae-3ba3d028dc96","Type":"ContainerDied","Data":"1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4"}
Jan 22 14:07:38 crc kubenswrapper[4801]: I0122 14:07:38.877423 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-drqnv"
Jan 22 14:07:38 crc kubenswrapper[4801]: I0122 14:07:38.877757 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-drqnv"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.239762 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-drqnv"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.289596 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w9tg8"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.289644 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w9tg8"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.490826 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w7hkb"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.490868 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w7hkb"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.607075 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w9tg8"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.611063 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w7hkb"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.646746 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w9tg8"
Jan 22 14:07:39 crc kubenswrapper[4801]: I0122 14:07:39.661931 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w7hkb"
Jan 22 14:07:40 crc kubenswrapper[4801]: I0122 14:07:40.809841 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7hkb"]
Jan 22 14:07:40 crc kubenswrapper[4801]: I0122 14:07:40.912564 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nkd9x"
Jan 22 14:07:40 crc kubenswrapper[4801]: I0122 14:07:40.912949 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nkd9x"
Jan 22 14:07:40 crc kubenswrapper[4801]: I0122 14:07:40.965523 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nkd9x"
Jan 22 14:07:41 crc kubenswrapper[4801]: I0122 14:07:41.621077 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w7hkb" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="registry-server" containerID="cri-o://a3d0bb3cbc46eee6708a1f1f9cfc8ecf69d8361ea81f56c81420659d0575f607" gracePeriod=2
Jan 22 14:07:41 crc kubenswrapper[4801]: I0122 14:07:41.709734 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nkd9x"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.140727 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tk862"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.141304 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tk862"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.189127 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tk862"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.454070 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gsvrn"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.454786 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gsvrn"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.512305 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gsvrn"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.627885 4801 generic.go:334] "Generic (PLEG): container finished" podID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerID="a3d0bb3cbc46eee6708a1f1f9cfc8ecf69d8361ea81f56c81420659d0575f607" exitCode=0
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.627945 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7hkb" event={"ID":"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c","Type":"ContainerDied","Data":"a3d0bb3cbc46eee6708a1f1f9cfc8ecf69d8361ea81f56c81420659d0575f607"}
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.630579 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87tcn" event={"ID":"5544e90b-4784-48bb-9462-5562be0bc923","Type":"ContainerStarted","Data":"68b378ee9c21fcb9eeef765fabad48c3b0a035104b0baabbd4b43e97d71ef58e"}
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.649528 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-87tcn" podStartSLOduration=4.020596204 podStartE2EDuration="1m4.64951087s" podCreationTimestamp="2026-01-22 14:06:38 +0000 UTC" firstStartedPulling="2026-01-22 14:06:40.904963484 +0000 UTC m=+149.606863667" lastFinishedPulling="2026-01-22 14:07:41.53387815 +0000 UTC m=+210.235778333" observedRunningTime="2026-01-22 14:07:42.64778753 +0000 UTC m=+211.349687723" watchObservedRunningTime="2026-01-22 14:07:42.64951087 +0000 UTC m=+211.351411053"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.678481 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tk862"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.681354 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gsvrn"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.718199 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7hkb"
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.781462 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-catalog-content\") pod \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") "
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.781590 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-utilities\") pod \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") "
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.781627 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phqnx\" (UniqueName: \"kubernetes.io/projected/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-kube-api-access-phqnx\") pod \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\" (UID: \"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c\") "
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.783613 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-utilities" (OuterVolumeSpecName: "utilities") pod "cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" (UID: "cf2e297a-8bf5-447a-97e5-2d7cff43ed0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.786261 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-kube-api-access-phqnx" (OuterVolumeSpecName: "kube-api-access-phqnx") pod "cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" (UID: "cf2e297a-8bf5-447a-97e5-2d7cff43ed0c"). InnerVolumeSpecName "kube-api-access-phqnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.844351 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" (UID: "cf2e297a-8bf5-447a-97e5-2d7cff43ed0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.884129 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.884162 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:42 crc kubenswrapper[4801]: I0122 14:07:42.884175 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phqnx\" (UniqueName: \"kubernetes.io/projected/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c-kube-api-access-phqnx\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:43 crc kubenswrapper[4801]: I0122 14:07:43.206907 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9tg8"]
Jan 22 14:07:43 crc kubenswrapper[4801]: I0122 14:07:43.207940 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w9tg8" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="registry-server" containerID="cri-o://ca3815bbeef0e49ed9075e63cc27e6088effe75674e2af44b6d6307b03120868" gracePeriod=2
Jan 22 14:07:43 crc kubenswrapper[4801]: I0122 14:07:43.638531 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7hkb" event={"ID":"cf2e297a-8bf5-447a-97e5-2d7cff43ed0c","Type":"ContainerDied","Data":"95917f7645ca3103073b1fedc299b2beab905eddfb6926dae3668e01f4473e2f"}
Jan 22 14:07:43 crc kubenswrapper[4801]: I0122 14:07:43.639188 4801 scope.go:117] "RemoveContainer" containerID="a3d0bb3cbc46eee6708a1f1f9cfc8ecf69d8361ea81f56c81420659d0575f607"
Jan 22 14:07:43 crc kubenswrapper[4801]: I0122 14:07:43.639417 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7hkb"
Jan 22 14:07:43 crc kubenswrapper[4801]: I0122 14:07:43.657147 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7hkb"]
Jan 22 14:07:43 crc kubenswrapper[4801]: I0122 14:07:43.659961 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w7hkb"]
Jan 22 14:07:44 crc kubenswrapper[4801]: I0122 14:07:44.406596 4801 scope.go:117] "RemoveContainer" containerID="f386d36e5df7c8d8c33b86a456865ce8e7a9be78a58a988694b4291740d07b27"
Jan 22 14:07:44 crc kubenswrapper[4801]: I0122 14:07:44.443986 4801 scope.go:117] "RemoveContainer" containerID="d69cddb9c1d538a73b06f83799c239abe4357c9409a9e49f43a3bf98979663a5"
Jan 22 14:07:45 crc kubenswrapper[4801]: I0122 14:07:45.585002 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" path="/var/lib/kubelet/pods/cf2e297a-8bf5-447a-97e5-2d7cff43ed0c/volumes"
Jan 22 14:07:45 crc kubenswrapper[4801]: I0122 14:07:45.663161 4801 generic.go:334] "Generic (PLEG): container finished" podID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerID="ca3815bbeef0e49ed9075e63cc27e6088effe75674e2af44b6d6307b03120868" exitCode=0
Jan 22 14:07:45 crc kubenswrapper[4801]: I0122 14:07:45.663227 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9tg8" event={"ID":"181b9b48-fc92-4b6d-b7e7-a61fb42b2024","Type":"ContainerDied","Data":"ca3815bbeef0e49ed9075e63cc27e6088effe75674e2af44b6d6307b03120868"}
Jan 22 14:07:45 crc kubenswrapper[4801]: I0122 14:07:45.893396 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gsvrn"]
Jan 22 14:07:45 crc kubenswrapper[4801]: I0122 14:07:45.893712 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gsvrn" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="registry-server" containerID="cri-o://402c62de30af11b3ae2d6c8231dc2b0ef7ad82caba57dfa5a3853eb222ce88fe" gracePeriod=2
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.237831 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9tg8"
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.428843 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzk6v\" (UniqueName: \"kubernetes.io/projected/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-kube-api-access-zzk6v\") pod \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") "
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.429917 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-utilities\") pod \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") "
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.429974 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-catalog-content\") pod \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\" (UID: \"181b9b48-fc92-4b6d-b7e7-a61fb42b2024\") "
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.431196 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-utilities" (OuterVolumeSpecName: "utilities") pod "181b9b48-fc92-4b6d-b7e7-a61fb42b2024" (UID: "181b9b48-fc92-4b6d-b7e7-a61fb42b2024"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.438726 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-kube-api-access-zzk6v" (OuterVolumeSpecName: "kube-api-access-zzk6v") pod "181b9b48-fc92-4b6d-b7e7-a61fb42b2024" (UID: "181b9b48-fc92-4b6d-b7e7-a61fb42b2024"). InnerVolumeSpecName "kube-api-access-zzk6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.530669 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.530699 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzk6v\" (UniqueName: \"kubernetes.io/projected/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-kube-api-access-zzk6v\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.671860 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skqvg" event={"ID":"c9a37ba1-7039-4287-a2ae-3ba3d028dc96","Type":"ContainerStarted","Data":"119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e"}
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.674488 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9tg8"
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.674671 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9tg8" event={"ID":"181b9b48-fc92-4b6d-b7e7-a61fb42b2024","Type":"ContainerDied","Data":"e468fd37eb3ad679dddccfc42ba3d7bdb7c586df0ad3f2e2a2cc38ef92855d1e"}
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.674742 4801 scope.go:117] "RemoveContainer" containerID="ca3815bbeef0e49ed9075e63cc27e6088effe75674e2af44b6d6307b03120868"
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.679415 4801 generic.go:334] "Generic (PLEG): container finished" podID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerID="402c62de30af11b3ae2d6c8231dc2b0ef7ad82caba57dfa5a3853eb222ce88fe" exitCode=0
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.679492 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsvrn" event={"ID":"8fad3b3a-1004-410f-9058-bddf49f4be78","Type":"ContainerDied","Data":"402c62de30af11b3ae2d6c8231dc2b0ef7ad82caba57dfa5a3853eb222ce88fe"}
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.692872 4801 scope.go:117] "RemoveContainer" containerID="3fa0055d99c58d1206f40646d5bbf9a36dbc4cf51d3807a7c48867b98571022d"
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.696279 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-skqvg" podStartSLOduration=5.356200222 podStartE2EDuration="1m6.696259883s" podCreationTimestamp="2026-01-22 14:06:40 +0000 UTC" firstStartedPulling="2026-01-22 14:06:43.069204263 +0000 UTC m=+151.771104436" lastFinishedPulling="2026-01-22 14:07:44.409263894 +0000 UTC m=+213.111164097" observedRunningTime="2026-01-22 14:07:46.692257878 +0000 UTC m=+215.394158061" watchObservedRunningTime="2026-01-22 14:07:46.696259883 +0000 UTC m=+215.398160066"
Jan 22 14:07:46 crc kubenswrapper[4801]: I0122 14:07:46.709785 4801 scope.go:117] "RemoveContainer" containerID="6570fea44558fe8818439a4a2416e7716172535155c2e9a8025555188a889346"
Jan 22 14:07:47 crc kubenswrapper[4801]: I0122 14:07:47.566797 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "181b9b48-fc92-4b6d-b7e7-a61fb42b2024" (UID: "181b9b48-fc92-4b6d-b7e7-a61fb42b2024"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:47 crc kubenswrapper[4801]: I0122 14:07:47.636077 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9tg8"]
Jan 22 14:07:47 crc kubenswrapper[4801]: I0122 14:07:47.642850 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w9tg8"]
Jan 22 14:07:47 crc kubenswrapper[4801]: I0122 14:07:47.648923 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181b9b48-fc92-4b6d-b7e7-a61fb42b2024-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.176043 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsvrn"
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.357792 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lvkh\" (UniqueName: \"kubernetes.io/projected/8fad3b3a-1004-410f-9058-bddf49f4be78-kube-api-access-6lvkh\") pod \"8fad3b3a-1004-410f-9058-bddf49f4be78\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") "
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.357991 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-catalog-content\") pod \"8fad3b3a-1004-410f-9058-bddf49f4be78\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") "
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.360880 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fad3b3a-1004-410f-9058-bddf49f4be78-kube-api-access-6lvkh" (OuterVolumeSpecName: "kube-api-access-6lvkh") pod "8fad3b3a-1004-410f-9058-bddf49f4be78" (UID: "8fad3b3a-1004-410f-9058-bddf49f4be78"). InnerVolumeSpecName "kube-api-access-6lvkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.361573 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-utilities\") pod \"8fad3b3a-1004-410f-9058-bddf49f4be78\" (UID: \"8fad3b3a-1004-410f-9058-bddf49f4be78\") "
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.362037 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lvkh\" (UniqueName: \"kubernetes.io/projected/8fad3b3a-1004-410f-9058-bddf49f4be78-kube-api-access-6lvkh\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.362225 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-utilities" (OuterVolumeSpecName: "utilities") pod "8fad3b3a-1004-410f-9058-bddf49f4be78" (UID: "8fad3b3a-1004-410f-9058-bddf49f4be78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.463257 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.535795 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fad3b3a-1004-410f-9058-bddf49f4be78" (UID: "8fad3b3a-1004-410f-9058-bddf49f4be78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.564573 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fad3b3a-1004-410f-9058-bddf49f4be78-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.699264 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsvrn" event={"ID":"8fad3b3a-1004-410f-9058-bddf49f4be78","Type":"ContainerDied","Data":"b25e49bb3da4b5227601a3d7a8c796d547a6ee99e8237bc485366b904f0d9e56"}
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.699480 4801 scope.go:117] "RemoveContainer" containerID="402c62de30af11b3ae2d6c8231dc2b0ef7ad82caba57dfa5a3853eb222ce88fe"
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.699966 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsvrn"
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.729441 4801 scope.go:117] "RemoveContainer" containerID="02a0eb926a948c836aeed2c10d2934dd12c763e3f30621872ebe328e56a0c8e1"
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.750862 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gsvrn"]
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.753341 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gsvrn"]
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.767523 4801 scope.go:117] "RemoveContainer" containerID="883a3bf9cb6cf6f62b1ac8de82d3f965aeb10c61f523ee66899cc19f1e050f62"
Jan 22 14:07:48 crc kubenswrapper[4801]: I0122 14:07:48.930043 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-drqnv"
Jan 22 14:07:49 crc kubenswrapper[4801]: I0122 14:07:49.108801 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-87tcn"
Jan 22 14:07:49 crc kubenswrapper[4801]: I0122 14:07:49.109139 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-87tcn"
Jan 22 14:07:49 crc kubenswrapper[4801]: I0122 14:07:49.160591 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-87tcn"
Jan 22 14:07:49 crc kubenswrapper[4801]: I0122 14:07:49.578720 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" path="/var/lib/kubelet/pods/181b9b48-fc92-4b6d-b7e7-a61fb42b2024/volumes"
Jan 22 14:07:49 crc kubenswrapper[4801]: I0122 14:07:49.580155 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" path="/var/lib/kubelet/pods/8fad3b3a-1004-410f-9058-bddf49f4be78/volumes"
Jan 22 14:07:49 crc kubenswrapper[4801]: I0122 14:07:49.747038 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-87tcn"
Jan 22 14:07:51 crc kubenswrapper[4801]: I0122 14:07:51.404290 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-skqvg"
Jan 22 14:07:51 crc kubenswrapper[4801]: I0122 14:07:51.404338 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-skqvg"
Jan 22 14:07:51 crc kubenswrapper[4801]: I0122 14:07:51.449191 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-skqvg"
Jan 22 14:07:51 crc kubenswrapper[4801]: I0122 14:07:51.758048 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-skqvg"
Jan 22 14:07:54 crc kubenswrapper[4801]: I0122 14:07:54.807569 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skqvg"]
Jan 22 14:07:54 crc kubenswrapper[4801]: I0122 14:07:54.808136 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-skqvg" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="registry-server" containerID="cri-o://119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e" gracePeriod=2
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.140229 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skqvg"
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.242043 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbghp\" (UniqueName: \"kubernetes.io/projected/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-kube-api-access-tbghp\") pod \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") "
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.242125 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-utilities\") pod \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") "
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.242158 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-catalog-content\") pod \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\" (UID: \"c9a37ba1-7039-4287-a2ae-3ba3d028dc96\") "
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.244422 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-utilities" (OuterVolumeSpecName: "utilities") pod "c9a37ba1-7039-4287-a2ae-3ba3d028dc96" (UID: "c9a37ba1-7039-4287-a2ae-3ba3d028dc96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.249686 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-kube-api-access-tbghp" (OuterVolumeSpecName: "kube-api-access-tbghp") pod "c9a37ba1-7039-4287-a2ae-3ba3d028dc96" (UID: "c9a37ba1-7039-4287-a2ae-3ba3d028dc96"). InnerVolumeSpecName "kube-api-access-tbghp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.263857 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9a37ba1-7039-4287-a2ae-3ba3d028dc96" (UID: "c9a37ba1-7039-4287-a2ae-3ba3d028dc96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.342821 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbghp\" (UniqueName: \"kubernetes.io/projected/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-kube-api-access-tbghp\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.342855 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.342874 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a37ba1-7039-4287-a2ae-3ba3d028dc96-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.734892 4801 generic.go:334] "Generic (PLEG): container finished" podID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerID="119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e" exitCode=0
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.734933 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skqvg" event={"ID":"c9a37ba1-7039-4287-a2ae-3ba3d028dc96","Type":"ContainerDied","Data":"119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e"}
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.734959 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skqvg" event={"ID":"c9a37ba1-7039-4287-a2ae-3ba3d028dc96","Type":"ContainerDied","Data":"5aa17bd75fbe663554b4c03f4708094e97e603d0d7d0b79f0ec71a54af1082eb"}
Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.734959 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skqvg" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.734977 4801 scope.go:117] "RemoveContainer" containerID="119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.751128 4801 scope.go:117] "RemoveContainer" containerID="1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.753382 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skqvg"] Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.756892 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-skqvg"] Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.763574 4801 scope.go:117] "RemoveContainer" containerID="7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.778115 4801 scope.go:117] "RemoveContainer" containerID="119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e" Jan 22 14:07:55 crc kubenswrapper[4801]: E0122 14:07:55.778460 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e\": container with ID starting with 119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e not found: ID does not exist" containerID="119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.778508 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e"} err="failed to get container status \"119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e\": rpc error: code = NotFound desc = could not find container 
\"119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e\": container with ID starting with 119d8e362e9a2c8a623e57b93fdc194084613ffd670f724a43c2bb9b7d3e717e not found: ID does not exist" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.778537 4801 scope.go:117] "RemoveContainer" containerID="1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4" Jan 22 14:07:55 crc kubenswrapper[4801]: E0122 14:07:55.778880 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4\": container with ID starting with 1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4 not found: ID does not exist" containerID="1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.778902 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4"} err="failed to get container status \"1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4\": rpc error: code = NotFound desc = could not find container \"1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4\": container with ID starting with 1c626403c43b472e90adf13be46de42bd833e84be82c6fe4c8d2a7ed45f4dec4 not found: ID does not exist" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.778918 4801 scope.go:117] "RemoveContainer" containerID="7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016" Jan 22 14:07:55 crc kubenswrapper[4801]: E0122 14:07:55.779145 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016\": container with ID starting with 7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016 not found: ID does not exist" 
containerID="7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016" Jan 22 14:07:55 crc kubenswrapper[4801]: I0122 14:07:55.779237 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016"} err="failed to get container status \"7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016\": rpc error: code = NotFound desc = could not find container \"7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016\": container with ID starting with 7d99d4a5751db0f83e7977894ac8a154f6ede792ae84030fe46498f5b9904016 not found: ID does not exist" Jan 22 14:07:57 crc kubenswrapper[4801]: I0122 14:07:57.587829 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" path="/var/lib/kubelet/pods/c9a37ba1-7039-4287-a2ae-3ba3d028dc96/volumes" Jan 22 14:07:59 crc kubenswrapper[4801]: I0122 14:07:59.687436 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zxs7c"] Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.549563 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-drqnv"] Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.550240 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-drqnv" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="registry-server" containerID="cri-o://c87a2ab20204af16a6d81b2c5214858e5f3286e56ddf9ade8fe4a37d6cf0c86b" gracePeriod=30 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.551819 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87tcn"] Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.552031 4801 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-87tcn" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="registry-server" containerID="cri-o://68b378ee9c21fcb9eeef765fabad48c3b0a035104b0baabbd4b43e97d71ef58e" gracePeriod=30 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.566182 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mg4vw"] Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.566376 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" podUID="401b3157-b40f-489c-827f-d4f941e96001" containerName="marketplace-operator" containerID="cri-o://9691f1f23d44d28dcf6f1a49b91033239fe920cb5b4614c27ad9c936772b5e5e" gracePeriod=30 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.582230 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkd9x"] Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.582883 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nkd9x" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="registry-server" containerID="cri-o://6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928" gracePeriod=30 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585537 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k78hx"] Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585819 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585841 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585856 4801 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585865 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585873 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585882 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585890 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585897 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585908 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585915 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585929 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585936 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585946 4801 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c2a742-83c5-44eb-9470-7b802883f5b6" containerName="pruner" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585953 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c2a742-83c5-44eb-9470-7b802883f5b6" containerName="pruner" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585963 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585970 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.585981 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.585988 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.586000 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586007 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="extract-utilities" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.586018 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586026 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.586036 4801 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586044 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.586055 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586062 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="extract-content" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586193 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a37ba1-7039-4287-a2ae-3ba3d028dc96" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586205 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fad3b3a-1004-410f-9058-bddf49f4be78" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586217 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c2a742-83c5-44eb-9470-7b802883f5b6" containerName="pruner" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586229 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2e297a-8bf5-447a-97e5-2d7cff43ed0c" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586240 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="181b9b48-fc92-4b6d-b7e7-a61fb42b2024" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.586737 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.591066 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tk862"] Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.591284 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tk862" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="registry-server" containerID="cri-o://fc56c87d6f7ca7e129f80456c71d84b31ae945343cbdb23d79a284b87d31c230" gracePeriod=30 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.596520 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k78hx"] Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.706662 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.706781 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qqww\" (UniqueName: \"kubernetes.io/projected/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-kube-api-access-7qqww\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.706823 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.760297 4801 generic.go:334] "Generic (PLEG): container finished" podID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerID="fc56c87d6f7ca7e129f80456c71d84b31ae945343cbdb23d79a284b87d31c230" exitCode=0 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.760358 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk862" event={"ID":"492ffad8-460b-46f1-b566-9e1cce5cbcb0","Type":"ContainerDied","Data":"fc56c87d6f7ca7e129f80456c71d84b31ae945343cbdb23d79a284b87d31c230"} Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.761512 4801 generic.go:334] "Generic (PLEG): container finished" podID="401b3157-b40f-489c-827f-d4f941e96001" containerID="9691f1f23d44d28dcf6f1a49b91033239fe920cb5b4614c27ad9c936772b5e5e" exitCode=0 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.761607 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" event={"ID":"401b3157-b40f-489c-827f-d4f941e96001","Type":"ContainerDied","Data":"9691f1f23d44d28dcf6f1a49b91033239fe920cb5b4614c27ad9c936772b5e5e"} Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.763748 4801 generic.go:334] "Generic (PLEG): container finished" podID="134f3572-e2f4-4817-93c5-e93c68b761de" containerID="6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928" exitCode=0 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.763797 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkd9x" event={"ID":"134f3572-e2f4-4817-93c5-e93c68b761de","Type":"ContainerDied","Data":"6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928"} Jan 22 
14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.765276 4801 generic.go:334] "Generic (PLEG): container finished" podID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerID="c87a2ab20204af16a6d81b2c5214858e5f3286e56ddf9ade8fe4a37d6cf0c86b" exitCode=0 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.765319 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drqnv" event={"ID":"ca3c0e1d-7cb5-400b-8c40-10fe54d57224","Type":"ContainerDied","Data":"c87a2ab20204af16a6d81b2c5214858e5f3286e56ddf9ade8fe4a37d6cf0c86b"} Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.766736 4801 generic.go:334] "Generic (PLEG): container finished" podID="5544e90b-4784-48bb-9462-5562be0bc923" containerID="68b378ee9c21fcb9eeef765fabad48c3b0a035104b0baabbd4b43e97d71ef58e" exitCode=0 Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.766757 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87tcn" event={"ID":"5544e90b-4784-48bb-9462-5562be0bc923","Type":"ContainerDied","Data":"68b378ee9c21fcb9eeef765fabad48c3b0a035104b0baabbd4b43e97d71ef58e"} Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.808625 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qqww\" (UniqueName: \"kubernetes.io/projected/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-kube-api-access-7qqww\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.808716 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.808830 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.813426 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.818735 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.827012 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qqww\" (UniqueName: \"kubernetes.io/projected/1d5c5ddb-bf6a-4b15-8171-7cf71089d411-kube-api-access-7qqww\") pod \"marketplace-operator-79b997595-k78hx\" (UID: \"1d5c5ddb-bf6a-4b15-8171-7cf71089d411\") " pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.912657 4801 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928 is running failed: container process not found" containerID="6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928" cmd=["grpc_health_probe","-addr=:50051"] Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.913009 4801 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928 is running failed: container process not found" containerID="6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928" cmd=["grpc_health_probe","-addr=:50051"] Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.913354 4801 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928 is running failed: container process not found" containerID="6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928" cmd=["grpc_health_probe","-addr=:50051"] Jan 22 14:08:00 crc kubenswrapper[4801]: E0122 14:08:00.913467 4801 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-nkd9x" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="registry-server" Jan 22 14:08:00 crc kubenswrapper[4801]: I0122 14:08:00.915295 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.043268 4801 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mg4vw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.043605 4801 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" podUID="401b3157-b40f-489c-827f-d4f941e96001" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.312766 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k78hx"] Jan 22 14:08:01 crc kubenswrapper[4801]: W0122 14:08:01.324161 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d5c5ddb_bf6a_4b15_8171_7cf71089d411.slice/crio-2dcee745ae41600c6fa2484ca8d88842a54ea963083031cbd8e58f9d146e3d7d WatchSource:0}: Error finding container 2dcee745ae41600c6fa2484ca8d88842a54ea963083031cbd8e58f9d146e3d7d: Status 404 returned error can't find the container with id 2dcee745ae41600c6fa2484ca8d88842a54ea963083031cbd8e58f9d146e3d7d Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.690979 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.775905 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" event={"ID":"1d5c5ddb-bf6a-4b15-8171-7cf71089d411","Type":"ContainerStarted","Data":"5ed21d16e2be98fa58d4e5df43fca787268a6b098887c519b2543833143b3fcc"} Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.775982 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" event={"ID":"1d5c5ddb-bf6a-4b15-8171-7cf71089d411","Type":"ContainerStarted","Data":"2dcee745ae41600c6fa2484ca8d88842a54ea963083031cbd8e58f9d146e3d7d"} Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.776263 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.780355 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkd9x" event={"ID":"134f3572-e2f4-4817-93c5-e93c68b761de","Type":"ContainerDied","Data":"bb60979a1dff206b06e5af275da6472950331df67f6302d30f85b8fe9d38973d"} Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.780388 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb60979a1dff206b06e5af275da6472950331df67f6302d30f85b8fe9d38973d" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.780498 4801 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k78hx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" start-of-body= Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.780525 4801 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.783741 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87tcn" event={"ID":"5544e90b-4784-48bb-9462-5562be0bc923","Type":"ContainerDied","Data":"eb1687fd83a584634befbf1599e688969001e3730f71dd393b2d68431c2db316"} Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.783778 4801 scope.go:117] "RemoveContainer" containerID="68b378ee9c21fcb9eeef765fabad48c3b0a035104b0baabbd4b43e97d71ef58e" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.783950 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-87tcn" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.788381 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk862" event={"ID":"492ffad8-460b-46f1-b566-9e1cce5cbcb0","Type":"ContainerDied","Data":"53dac857eaf59bdcd489024087cbebb9141b1f114457eb8a9b83b8c39a8e43d0"} Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.788412 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53dac857eaf59bdcd489024087cbebb9141b1f114457eb8a9b83b8c39a8e43d0" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.790493 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" event={"ID":"401b3157-b40f-489c-827f-d4f941e96001","Type":"ContainerDied","Data":"61ae00462826d359b1f89a1db0508f68673f85c1b2401a2b4118c072525356b2"} Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.790525 4801 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="61ae00462826d359b1f89a1db0508f68673f85c1b2401a2b4118c072525356b2" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.798410 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" podStartSLOduration=1.798370249 podStartE2EDuration="1.798370249s" podCreationTimestamp="2026-01-22 14:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:01.794972382 +0000 UTC m=+230.496872565" watchObservedRunningTime="2026-01-22 14:08:01.798370249 +0000 UTC m=+230.500270422" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.798728 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.800366 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.801766 4801 scope.go:117] "RemoveContainer" containerID="e60b5c8afcae40170411fcae6badef3b2d427d549604433a2e0eac6787d33963" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.806040 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.818042 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxt5r\" (UniqueName: \"kubernetes.io/projected/5544e90b-4784-48bb-9462-5562be0bc923-kube-api-access-cxt5r\") pod \"5544e90b-4784-48bb-9462-5562be0bc923\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.818127 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-utilities\") pod \"5544e90b-4784-48bb-9462-5562be0bc923\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.818154 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-catalog-content\") pod \"5544e90b-4784-48bb-9462-5562be0bc923\" (UID: \"5544e90b-4784-48bb-9462-5562be0bc923\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.826609 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-utilities" (OuterVolumeSpecName: "utilities") pod "5544e90b-4784-48bb-9462-5562be0bc923" (UID: "5544e90b-4784-48bb-9462-5562be0bc923"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.829487 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5544e90b-4784-48bb-9462-5562be0bc923-kube-api-access-cxt5r" (OuterVolumeSpecName: "kube-api-access-cxt5r") pod "5544e90b-4784-48bb-9462-5562be0bc923" (UID: "5544e90b-4784-48bb-9462-5562be0bc923"). InnerVolumeSpecName "kube-api-access-cxt5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.851168 4801 scope.go:117] "RemoveContainer" containerID="053a852405672299475bd86f32a7f7a203f6a48466b41949a2ceec1b20def896" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.873372 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.919905 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-catalog-content\") pod \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.919955 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljkk2\" (UniqueName: \"kubernetes.io/projected/492ffad8-460b-46f1-b566-9e1cce5cbcb0-kube-api-access-ljkk2\") pod \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.919988 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-utilities\") pod \"134f3572-e2f4-4817-93c5-e93c68b761de\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.920042 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca\") pod \"401b3157-b40f-489c-827f-d4f941e96001\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.920070 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics\") pod \"401b3157-b40f-489c-827f-d4f941e96001\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.920103 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zwnb\" (UniqueName: \"kubernetes.io/projected/401b3157-b40f-489c-827f-d4f941e96001-kube-api-access-4zwnb\") pod \"401b3157-b40f-489c-827f-d4f941e96001\" (UID: \"401b3157-b40f-489c-827f-d4f941e96001\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.920143 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-utilities\") pod \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\" (UID: \"492ffad8-460b-46f1-b566-9e1cce5cbcb0\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.920161 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-catalog-content\") pod \"134f3572-e2f4-4817-93c5-e93c68b761de\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.920187 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lkw9\" (UniqueName: \"kubernetes.io/projected/134f3572-e2f4-4817-93c5-e93c68b761de-kube-api-access-2lkw9\") pod \"134f3572-e2f4-4817-93c5-e93c68b761de\" (UID: \"134f3572-e2f4-4817-93c5-e93c68b761de\") " Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.920437 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxt5r\" (UniqueName: \"kubernetes.io/projected/5544e90b-4784-48bb-9462-5562be0bc923-kube-api-access-cxt5r\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:01 crc 
kubenswrapper[4801]: I0122 14:08:01.920467 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.922051 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-utilities" (OuterVolumeSpecName: "utilities") pod "492ffad8-460b-46f1-b566-9e1cce5cbcb0" (UID: "492ffad8-460b-46f1-b566-9e1cce5cbcb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.922541 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "401b3157-b40f-489c-827f-d4f941e96001" (UID: "401b3157-b40f-489c-827f-d4f941e96001"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.923483 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-utilities" (OuterVolumeSpecName: "utilities") pod "134f3572-e2f4-4817-93c5-e93c68b761de" (UID: "134f3572-e2f4-4817-93c5-e93c68b761de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.924322 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134f3572-e2f4-4817-93c5-e93c68b761de-kube-api-access-2lkw9" (OuterVolumeSpecName: "kube-api-access-2lkw9") pod "134f3572-e2f4-4817-93c5-e93c68b761de" (UID: "134f3572-e2f4-4817-93c5-e93c68b761de"). InnerVolumeSpecName "kube-api-access-2lkw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.925593 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492ffad8-460b-46f1-b566-9e1cce5cbcb0-kube-api-access-ljkk2" (OuterVolumeSpecName: "kube-api-access-ljkk2") pod "492ffad8-460b-46f1-b566-9e1cce5cbcb0" (UID: "492ffad8-460b-46f1-b566-9e1cce5cbcb0"). InnerVolumeSpecName "kube-api-access-ljkk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.925718 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "401b3157-b40f-489c-827f-d4f941e96001" (UID: "401b3157-b40f-489c-827f-d4f941e96001"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.934790 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401b3157-b40f-489c-827f-d4f941e96001-kube-api-access-4zwnb" (OuterVolumeSpecName: "kube-api-access-4zwnb") pod "401b3157-b40f-489c-827f-d4f941e96001" (UID: "401b3157-b40f-489c-827f-d4f941e96001"). InnerVolumeSpecName "kube-api-access-4zwnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.952152 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "134f3572-e2f4-4817-93c5-e93c68b761de" (UID: "134f3572-e2f4-4817-93c5-e93c68b761de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:01 crc kubenswrapper[4801]: I0122 14:08:01.957193 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5544e90b-4784-48bb-9462-5562be0bc923" (UID: "5544e90b-4784-48bb-9462-5562be0bc923"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021138 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn8tl\" (UniqueName: \"kubernetes.io/projected/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-kube-api-access-dn8tl\") pod \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021191 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-catalog-content\") pod \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021210 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-utilities\") pod \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\" (UID: \"ca3c0e1d-7cb5-400b-8c40-10fe54d57224\") " Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021378 4801 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401b3157-b40f-489c-827f-d4f941e96001-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021389 4801 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/401b3157-b40f-489c-827f-d4f941e96001-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021398 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zwnb\" (UniqueName: \"kubernetes.io/projected/401b3157-b40f-489c-827f-d4f941e96001-kube-api-access-4zwnb\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021407 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021416 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021423 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lkw9\" (UniqueName: \"kubernetes.io/projected/134f3572-e2f4-4817-93c5-e93c68b761de-kube-api-access-2lkw9\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021431 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljkk2\" (UniqueName: \"kubernetes.io/projected/492ffad8-460b-46f1-b566-9e1cce5cbcb0-kube-api-access-ljkk2\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021441 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/134f3572-e2f4-4817-93c5-e93c68b761de-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.021470 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5544e90b-4784-48bb-9462-5562be0bc923-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.022437 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-utilities" (OuterVolumeSpecName: "utilities") pod "ca3c0e1d-7cb5-400b-8c40-10fe54d57224" (UID: "ca3c0e1d-7cb5-400b-8c40-10fe54d57224"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.024683 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-kube-api-access-dn8tl" (OuterVolumeSpecName: "kube-api-access-dn8tl") pod "ca3c0e1d-7cb5-400b-8c40-10fe54d57224" (UID: "ca3c0e1d-7cb5-400b-8c40-10fe54d57224"). InnerVolumeSpecName "kube-api-access-dn8tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.055410 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "492ffad8-460b-46f1-b566-9e1cce5cbcb0" (UID: "492ffad8-460b-46f1-b566-9e1cce5cbcb0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.068057 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca3c0e1d-7cb5-400b-8c40-10fe54d57224" (UID: "ca3c0e1d-7cb5-400b-8c40-10fe54d57224"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.113222 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87tcn"] Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.115375 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-87tcn"] Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.122062 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492ffad8-460b-46f1-b566-9e1cce5cbcb0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.122288 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn8tl\" (UniqueName: \"kubernetes.io/projected/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-kube-api-access-dn8tl\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.122298 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.122306 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca3c0e1d-7cb5-400b-8c40-10fe54d57224-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.498525 4801 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.498799 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c" 
gracePeriod=15 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.498938 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07" gracePeriod=15 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.498980 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc" gracePeriod=15 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.499011 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4" gracePeriod=15 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.499037 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77" gracePeriod=15 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502185 4801 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502417 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502431 4801 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502705 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401b3157-b40f-489c-827f-d4f941e96001" containerName="marketplace-operator" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502727 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="401b3157-b40f-489c-827f-d4f941e96001" containerName="marketplace-operator" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502737 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502745 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502755 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502763 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502775 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502783 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502793 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502802 4801 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502815 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502823 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502833 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502841 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502853 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502860 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502871 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502879 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502888 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 14:08:02 crc kubenswrapper[4801]: 
I0122 14:08:02.502896 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502907 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502915 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502925 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502934 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502945 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502953 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502969 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.502977 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.502988 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: 
I0122 14:08:02.502997 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.503008 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.503017 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.503029 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.503037 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="extract-content" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.503048 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.503056 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="extract-utilities" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.505624 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.505649 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.505661 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.505674 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="401b3157-b40f-489c-827f-d4f941e96001" containerName="marketplace-operator" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.506442 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="5544e90b-4784-48bb-9462-5562be0bc923" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.506483 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.506504 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.506516 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.506528 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.506540 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.506551 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" containerName="registry-server" Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.507682 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.507704 4801 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.509581 4801 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.510104 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.513619 4801 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.549630 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.628262 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.628518 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.628626 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.628701 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.628793 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.628912 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.629093 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.629227 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730025 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730122 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730079 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730186 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730205 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730232 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730277 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730288 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730320 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730380 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730351 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730422 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730518 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730674 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730867 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.730925 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.796495 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drqnv" event={"ID":"ca3c0e1d-7cb5-400b-8c40-10fe54d57224","Type":"ContainerDied","Data":"1fd61157a6df2aebd941bfae9f23f2ce6b5fc6d789e4fabf14c2ea8222c0bf92"} Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.796557 4801 scope.go:117] "RemoveContainer" containerID="c87a2ab20204af16a6d81b2c5214858e5f3286e56ddf9ade8fe4a37d6cf0c86b" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.796650 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-drqnv" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.801228 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.801712 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.805536 4801 generic.go:334] "Generic (PLEG): container finished" podID="bbee368a-1c06-459f-b5dc-44269645b020" containerID="ae96634dab70ce7eb95676e01d086683b3dd55811bdf1d0b992213db7b80b357" exitCode=0 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.805622 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbee368a-1c06-459f-b5dc-44269645b020","Type":"ContainerDied","Data":"ae96634dab70ce7eb95676e01d086683b3dd55811bdf1d0b992213db7b80b357"} Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.806238 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.806494 4801 status_manager.go:851] "Failed to get status for 
pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.806765 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.808876 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.810134 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.810288 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.810704 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.811071 4801 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.811191 4801 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07" exitCode=0 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.811230 4801 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc" exitCode=0 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.811242 4801 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4" exitCode=0 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.811254 4801 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77" exitCode=2 Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.811388 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.812821 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tk862" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.815691 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkd9x" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.816158 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.816318 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.816369 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.816969 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.817318 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.818393 4801 status_manager.go:851] 
"Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.819166 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.819592 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.819847 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.820084 4801 scope.go:117] "RemoveContainer" containerID="cf6ad6da154a5a2bf22ccb93f68b1eaa98ae63d01a4906380984c35662cabac2" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.820176 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.820375 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.820624 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.820814 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.821032 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.846009 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.849950 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.850315 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.850671 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.850977 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.852894 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.853124 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.855610 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.856408 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.856932 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.857936 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.858209 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.858506 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.858733 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.858963 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:02 crc kubenswrapper[4801]: I0122 14:08:02.866006 4801 scope.go:117] "RemoveContainer" containerID="43e0a286c4ed02ffe0ea28c3f5d04e5cf6392f4755bac7623de0ea9dd999c243" Jan 22 14:08:02 crc 
kubenswrapper[4801]: I0122 14:08:02.889224 4801 scope.go:117] "RemoveContainer" containerID="244d3f8c5d3c565fe9081091388d64645b3037f1d21906401acd03765e410367" Jan 22 14:08:02 crc kubenswrapper[4801]: W0122 14:08:02.899185 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-59ccc599f4e14ae96c42137d805bbef21aa3d41d33e99523fb27faedd12415d1 WatchSource:0}: Error finding container 59ccc599f4e14ae96c42137d805bbef21aa3d41d33e99523fb27faedd12415d1: Status 404 returned error can't find the container with id 59ccc599f4e14ae96c42137d805bbef21aa3d41d33e99523fb27faedd12415d1 Jan 22 14:08:02 crc kubenswrapper[4801]: E0122 14:08:02.903143 4801 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.86:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188d12c8f2c1452a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 14:08:02.901984554 +0000 UTC m=+231.603884737,LastTimestamp:2026-01-22 14:08:02.901984554 +0000 UTC m=+231.603884737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.579864 4801 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5544e90b-4784-48bb-9462-5562be0bc923" path="/var/lib/kubelet/pods/5544e90b-4784-48bb-9462-5562be0bc923/volumes" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.819085 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45"} Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.819141 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"59ccc599f4e14ae96c42137d805bbef21aa3d41d33e99523fb27faedd12415d1"} Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.820688 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.821187 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.821609 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: 
connect: connection refused" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.822022 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.822401 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.822780 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.822900 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 14:08:03 crc kubenswrapper[4801]: I0122 14:08:03.823195 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.102043 4801 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.102491 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.102653 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.102906 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.103357 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.103747 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.104068 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.104594 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.252721 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-var-lock\") pod \"bbee368a-1c06-459f-b5dc-44269645b020\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.253099 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbee368a-1c06-459f-b5dc-44269645b020-kube-api-access\") pod \"bbee368a-1c06-459f-b5dc-44269645b020\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.252965 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-var-lock" (OuterVolumeSpecName: "var-lock") pod "bbee368a-1c06-459f-b5dc-44269645b020" (UID: 
"bbee368a-1c06-459f-b5dc-44269645b020"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.253212 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-kubelet-dir\") pod \"bbee368a-1c06-459f-b5dc-44269645b020\" (UID: \"bbee368a-1c06-459f-b5dc-44269645b020\") " Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.253486 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bbee368a-1c06-459f-b5dc-44269645b020" (UID: "bbee368a-1c06-459f-b5dc-44269645b020"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.253586 4801 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-var-lock\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.260258 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbee368a-1c06-459f-b5dc-44269645b020-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bbee368a-1c06-459f-b5dc-44269645b020" (UID: "bbee368a-1c06-459f-b5dc-44269645b020"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.355226 4801 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbee368a-1c06-459f-b5dc-44269645b020-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.355270 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbee368a-1c06-459f-b5dc-44269645b020-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.830929 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.831489 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bbee368a-1c06-459f-b5dc-44269645b020","Type":"ContainerDied","Data":"50ec26ecdbd1bb3ab3c238f81bfec031547a5ddcc4de12cab9b20e9ea48a1681"} Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.831527 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50ec26ecdbd1bb3ab3c238f81bfec031547a5ddcc4de12cab9b20e9ea48a1681" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.843205 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.843591 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.843887 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.844078 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.844225 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.844365 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.844522 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" 
pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: E0122 14:08:04.968222 4801 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: E0122 14:08:04.968910 4801 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: E0122 14:08:04.969175 4801 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: E0122 14:08:04.969388 4801 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: E0122 14:08:04.969686 4801 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:04 crc kubenswrapper[4801]: I0122 14:08:04.969745 4801 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 22 14:08:04 crc kubenswrapper[4801]: E0122 14:08:04.970110 4801 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" interval="200ms" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.171553 4801 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" interval="400ms" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.399548 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.400852 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.401578 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.402117 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.402705 4801 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.403181 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.403639 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.404161 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.404674 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.405132 4801 status_manager.go:851] "Failed to get status 
for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.572779 4801 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" interval="800ms" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579338 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579406 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579436 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579457 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579576 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579593 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579801 4801 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579825 4801 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.579839 4801 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.584241 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.745217 4801 event.go:368] "Unable to 
write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.86:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188d12c8f2c1452a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 14:08:02.901984554 +0000 UTC m=+231.603884737,LastTimestamp:2026-01-22 14:08:02.901984554 +0000 UTC m=+231.603884737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.842430 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.844001 4801 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c" exitCode=0 Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.844065 4801 scope.go:117] "RemoveContainer" containerID="6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.844124 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.845911 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.846570 4801 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.846851 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.847324 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.847864 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.848365 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.848868 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.849158 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.849762 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.850088 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.850338 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.850621 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.850972 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.851324 4801 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.851660 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.851833 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.867028 4801 scope.go:117] "RemoveContainer" containerID="7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.882290 4801 scope.go:117] "RemoveContainer" containerID="7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.896930 4801 scope.go:117] "RemoveContainer" containerID="a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.914620 4801 scope.go:117] "RemoveContainer" containerID="85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.933896 4801 scope.go:117] "RemoveContainer" containerID="cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.956119 4801 scope.go:117] "RemoveContainer" containerID="6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.956648 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\": container with ID starting with 
6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07 not found: ID does not exist" containerID="6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.956685 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07"} err="failed to get container status \"6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\": rpc error: code = NotFound desc = could not find container \"6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07\": container with ID starting with 6f171562f5c3c41dd24955d4c8a5105a6db5958ba94727542cd5de1002007b07 not found: ID does not exist" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.956711 4801 scope.go:117] "RemoveContainer" containerID="7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.956974 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\": container with ID starting with 7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc not found: ID does not exist" containerID="7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.956994 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc"} err="failed to get container status \"7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\": rpc error: code = NotFound desc = could not find container \"7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc\": container with ID starting with 7bdb4238e8ad095b407ef426164230bd51146c0c77c3d9d773cee1b2615710fc not found: ID does not 
exist" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.957010 4801 scope.go:117] "RemoveContainer" containerID="7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.957279 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\": container with ID starting with 7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4 not found: ID does not exist" containerID="7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.957299 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4"} err="failed to get container status \"7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\": rpc error: code = NotFound desc = could not find container \"7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4\": container with ID starting with 7325ba95f9c0c031496fada71672af107ccf744c540eea9b75e8636f28f5a1a4 not found: ID does not exist" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.957314 4801 scope.go:117] "RemoveContainer" containerID="a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.957499 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\": container with ID starting with a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77 not found: ID does not exist" containerID="a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.957521 4801 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77"} err="failed to get container status \"a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\": rpc error: code = NotFound desc = could not find container \"a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77\": container with ID starting with a5771467c301a131ad25df9775f73d8e4cc16367a87ffcc0052a958194d4ea77 not found: ID does not exist" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.957537 4801 scope.go:117] "RemoveContainer" containerID="85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.958118 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\": container with ID starting with 85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c not found: ID does not exist" containerID="85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.958144 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c"} err="failed to get container status \"85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\": rpc error: code = NotFound desc = could not find container \"85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c\": container with ID starting with 85da4a2168b7f70b2d97f6c95bdf0c58c2d152e0eb2eb781155c9cdb8e3da38c not found: ID does not exist" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.958163 4801 scope.go:117] "RemoveContainer" containerID="cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d" Jan 22 14:08:05 crc kubenswrapper[4801]: E0122 14:08:05.958512 4801 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\": container with ID starting with cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d not found: ID does not exist" containerID="cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d" Jan 22 14:08:05 crc kubenswrapper[4801]: I0122 14:08:05.958551 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d"} err="failed to get container status \"cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\": rpc error: code = NotFound desc = could not find container \"cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d\": container with ID starting with cb38c55e78ab0e017e99d2b64cedd4396ccb11db26eb57f13e822229eaef0c0d not found: ID does not exist" Jan 22 14:08:06 crc kubenswrapper[4801]: E0122 14:08:06.374775 4801 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" interval="1.6s" Jan 22 14:08:07 crc kubenswrapper[4801]: E0122 14:08:07.975790 4801 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" interval="3.2s" Jan 22 14:08:09 crc kubenswrapper[4801]: E0122 14:08:09.655793 4801 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.86:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" volumeName="registry-storage" Jan 22 14:08:11 crc kubenswrapper[4801]: E0122 14:08:11.177300 4801 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.86:6443: connect: connection refused" interval="6.4s" Jan 22 14:08:11 crc kubenswrapper[4801]: I0122 14:08:11.577136 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:11 crc kubenswrapper[4801]: I0122 14:08:11.577513 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:11 crc kubenswrapper[4801]: I0122 14:08:11.577977 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:11 crc kubenswrapper[4801]: I0122 14:08:11.578299 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" 
pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:11 crc kubenswrapper[4801]: I0122 14:08:11.578676 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:11 crc kubenswrapper[4801]: I0122 14:08:11.579214 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:11 crc kubenswrapper[4801]: I0122 14:08:11.579648 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.570764 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.572266 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.572791 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.573128 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.573395 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.573676 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.573923 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.574154 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.588373 4801 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.588427 4801 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:15 crc kubenswrapper[4801]: E0122 14:08:15.589184 4801 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.589856 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:15 crc kubenswrapper[4801]: W0122 14:08:15.613210 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-fd3154988d882aa48cf820cec4c2a3008683b82c13050ecb00288054529a9901 WatchSource:0}: Error finding container fd3154988d882aa48cf820cec4c2a3008683b82c13050ecb00288054529a9901: Status 404 returned error can't find the container with id fd3154988d882aa48cf820cec4c2a3008683b82c13050ecb00288054529a9901 Jan 22 14:08:15 crc kubenswrapper[4801]: E0122 14:08:15.747112 4801 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.86:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188d12c8f2c1452a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 14:08:02.901984554 +0000 UTC m=+231.603884737,LastTimestamp:2026-01-22 14:08:02.901984554 +0000 UTC m=+231.603884737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.910040 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fd3154988d882aa48cf820cec4c2a3008683b82c13050ecb00288054529a9901"} Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.913044 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.913092 4801 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04" exitCode=1 Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.913123 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04"} Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.913518 4801 scope.go:117] "RemoveContainer" containerID="3328cc6ce826e8a504af6daa13f570ecd3171ee252ff3980d5716f0c2ff04f04" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.913911 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.914565 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc 
kubenswrapper[4801]: I0122 14:08:15.914783 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.915122 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.915375 4801 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.915585 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.915810 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: 
connect: connection refused" Jan 22 14:08:15 crc kubenswrapper[4801]: I0122 14:08:15.916031 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.923701 4801 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1d4a27c5a561eca26a3f0adee47ea958af044be84f644f89ab46eaa8806889eb" exitCode=0 Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.923747 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1d4a27c5a561eca26a3f0adee47ea958af044be84f644f89ab46eaa8806889eb"} Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.924124 4801 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.924162 4801 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.924753 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: E0122 14:08:16.924921 4801 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.925240 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.925768 4801 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.926273 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.926806 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.927322 4801 status_manager.go:851] "Failed to get status for pod" 
podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.928195 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.928828 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.928898 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"deb882ebe2185b3954c11494951826c3aefeba58db6e347870c99122192e4b68"} Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.928896 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.929595 4801 status_manager.go:851] "Failed to get status for pod" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" pod="openshift-marketplace/redhat-operators-tk862" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-tk862\": 
dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.930051 4801 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.930520 4801 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.930891 4801 status_manager.go:851] "Failed to get status for pod" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" pod="openshift-marketplace/certified-operators-drqnv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-drqnv\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.931510 4801 status_manager.go:851] "Failed to get status for pod" podUID="1d5c5ddb-bf6a-4b15-8171-7cf71089d411" pod="openshift-marketplace/marketplace-operator-79b997595-k78hx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-k78hx\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.931826 4801 status_manager.go:851] "Failed to get status for pod" podUID="bbee368a-1c06-459f-b5dc-44269645b020" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.932215 4801 status_manager.go:851] "Failed to get status for pod" podUID="401b3157-b40f-489c-827f-d4f941e96001" pod="openshift-marketplace/marketplace-operator-79b997595-mg4vw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-mg4vw\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:16 crc kubenswrapper[4801]: I0122 14:08:16.932857 4801 status_manager.go:851] "Failed to get status for pod" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" pod="openshift-marketplace/redhat-marketplace-nkd9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nkd9x\": dial tcp 38.129.56.86:6443: connect: connection refused" Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938420 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"54a99eb3ab5d3dba4a57ca4c60d3304f3de637098dc3d8c414153e109a9ae7db"} Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938724 4801 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938748 4801 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938751 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"963fb55d0327645e6fcb3a8a2e570f644cb97b53e287b14850eef1a35577c3d9"} Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938763 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"716ca8010dda4db79c17952774502a053a7ca63d7700d581683ab497e90c3cf3"} Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938774 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"38721d9e386eb7421755494b0842dadb37e288a27aca1709718c126972c44a65"} Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938782 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4389c32c29adc52f68f20f465110a944014675c78ec7062d5e92ce29c77f3843"} Jan 22 14:08:17 crc kubenswrapper[4801]: I0122 14:08:17.938797 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:20 crc kubenswrapper[4801]: I0122 14:08:20.590612 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:20 crc kubenswrapper[4801]: I0122 14:08:20.590691 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:20 crc kubenswrapper[4801]: I0122 14:08:20.596688 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:23 crc kubenswrapper[4801]: I0122 14:08:23.697778 4801 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 14:08:23 crc kubenswrapper[4801]: I0122 14:08:23.837727 4801 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1cdfabad-c24c-4c34-be95-179705298060" Jan 22 14:08:23 crc kubenswrapper[4801]: I0122 14:08:23.969070 4801 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:23 crc kubenswrapper[4801]: I0122 14:08:23.969428 4801 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30" Jan 22 14:08:23 crc kubenswrapper[4801]: I0122 14:08:23.971478 4801 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1cdfabad-c24c-4c34-be95-179705298060" Jan 22 14:08:24 crc kubenswrapper[4801]: I0122 14:08:24.189603 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 14:08:24 crc kubenswrapper[4801]: I0122 14:08:24.722722 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" podUID="7d2c98a4-f3a8-4200-8432-6f68459320ca" containerName="oauth-openshift" containerID="cri-o://19715c9607ebc5693073fa43c0f262316a7afd480e40c3fecb7a5aa8e4acebb6" gracePeriod=15 Jan 22 14:08:24 crc kubenswrapper[4801]: I0122 14:08:24.977468 4801 generic.go:334] "Generic (PLEG): container finished" podID="7d2c98a4-f3a8-4200-8432-6f68459320ca" containerID="19715c9607ebc5693073fa43c0f262316a7afd480e40c3fecb7a5aa8e4acebb6" exitCode=0 Jan 22 14:08:24 crc kubenswrapper[4801]: I0122 14:08:24.977506 4801 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" event={"ID":"7d2c98a4-f3a8-4200-8432-6f68459320ca","Type":"ContainerDied","Data":"19715c9607ebc5693073fa43c0f262316a7afd480e40c3fecb7a5aa8e4acebb6"} Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.069810 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108626 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-idp-0-file-data\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108675 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-session\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108697 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr9z7\" (UniqueName: \"kubernetes.io/projected/7d2c98a4-f3a8-4200-8432-6f68459320ca-kube-api-access-pr9z7\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108727 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-login\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 
14:08:25.108756 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-service-ca\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108777 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-dir\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108802 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-ocp-branding-template\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108826 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-router-certs\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108864 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-provider-selection\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108889 4801 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-serving-cert\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108920 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-error\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108939 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-policies\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.108973 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-cliconfig\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.109014 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-trusted-ca-bundle\") pod \"7d2c98a4-f3a8-4200-8432-6f68459320ca\" (UID: \"7d2c98a4-f3a8-4200-8432-6f68459320ca\") " Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.109374 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.110227 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.110217 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.110266 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.110703 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.115613 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.115799 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2c98a4-f3a8-4200-8432-6f68459320ca-kube-api-access-pr9z7" (OuterVolumeSpecName: "kube-api-access-pr9z7") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "kube-api-access-pr9z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.116691 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.117331 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.117491 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.117731 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.117814 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.117974 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.118088 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7d2c98a4-f3a8-4200-8432-6f68459320ca" (UID: "7d2c98a4-f3a8-4200-8432-6f68459320ca"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209526 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209792 4801 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209802 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209812 4801 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209821 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209829 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209837 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr9z7\" (UniqueName: \"kubernetes.io/projected/7d2c98a4-f3a8-4200-8432-6f68459320ca-kube-api-access-pr9z7\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209846 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209854 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209862 4801 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d2c98a4-f3a8-4200-8432-6f68459320ca-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209870 4801 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209879 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209891 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.209900 4801 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d2c98a4-f3a8-4200-8432-6f68459320ca-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.381172 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.385711 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.987622 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.987975 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zxs7c" event={"ID":"7d2c98a4-f3a8-4200-8432-6f68459320ca","Type":"ContainerDied","Data":"8f0f8c6b3f492cd73e02ba02e0cc4d3015f05b99f6b236dee54fb361caa9f569"} Jan 22 14:08:25 crc kubenswrapper[4801]: I0122 14:08:25.989063 4801 scope.go:117] "RemoveContainer" containerID="19715c9607ebc5693073fa43c0f262316a7afd480e40c3fecb7a5aa8e4acebb6" Jan 22 14:08:33 crc kubenswrapper[4801]: I0122 14:08:33.469429 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 22 14:08:34 crc kubenswrapper[4801]: I0122 14:08:34.053772 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 22 14:08:34 crc kubenswrapper[4801]: I0122 14:08:34.195021 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 14:08:34 crc kubenswrapper[4801]: I0122 14:08:34.421900 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 22 14:08:34 crc kubenswrapper[4801]: I0122 14:08:34.440776 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 22 14:08:34 crc kubenswrapper[4801]: I0122 14:08:34.513378 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 22 14:08:34 crc kubenswrapper[4801]: I0122 14:08:34.706252 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 22 14:08:34 crc kubenswrapper[4801]: 
I0122 14:08:34.783966 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.063212 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.100072 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.158812 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.399349 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.530685 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.761174 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.905531 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.922149 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.923805 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 22 14:08:35 crc kubenswrapper[4801]: I0122 14:08:35.925710 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 22 14:08:35 crc 
kubenswrapper[4801]: I0122 14:08:35.996593 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.276882 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.284222 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.568917 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.601394 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.614715 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.633040 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.718719 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.725784 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 22 14:08:36 crc kubenswrapper[4801]: I0122 14:08:36.832483 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.173503 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 22 
14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.205351 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.261856 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.415121 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.416645 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.460020 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.467085 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.549231 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.644664 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.677713 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.802189 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.807088 4801 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.818531 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.820511 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 22 14:08:37 crc kubenswrapper[4801]: I0122 14:08:37.825753 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.036254 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.134037 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.360546 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.492466 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.495358 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.508628 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.541697 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.607748 4801 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.634317 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.640031 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.661725 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.662415 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.666929 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.759368 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.840645 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.895040 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.905686 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.945649 4801 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 22 14:08:38 crc kubenswrapper[4801]: I0122 14:08:38.980848 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.096239 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.153804 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.219133 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.275678 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.286693 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.302496 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.321353 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.359108 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.374108 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 
14:08:39.402474 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.857848 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.858790 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.890522 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 22 14:08:39 crc kubenswrapper[4801]: I0122 14:08:39.890549 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.066395 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.145415 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.207957 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.263897 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.285591 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.355880 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 22 14:08:40 crc 
kubenswrapper[4801]: I0122 14:08:40.411108 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.478129 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.504470 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.538284 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.547708 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.576650 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.661896 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.678494 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.697575 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.713723 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.739158 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.814811 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.913363 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.930830 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.957329 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.967052 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 22 14:08:40 crc kubenswrapper[4801]: I0122 14:08:40.986817 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.008662 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.017234 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.017403 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.090351 4801 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.096569 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.152643 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.169651 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.229422 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.246730 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.266479 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.378232 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.410391 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.431995 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.451114 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.499727 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.512781 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.528049 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.529903 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.562610 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.596487 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.645398 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.717947 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.753576 4801 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.823717 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.826991 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.828973 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 22 14:08:41 crc kubenswrapper[4801]: I0122 14:08:41.855764 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.018923 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.093653 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.098890 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.136922 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.163112 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.233542 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.262799 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.263935 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.280076 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.293190 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.314683 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.330162 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.343820 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.613013 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.703421 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.822083 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.938577 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 22 14:08:42 crc kubenswrapper[4801]: I0122 14:08:42.987100 4801 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.005517 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.013861 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.104740 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.160119 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.250109 4801 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.251688 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.251668649 podStartE2EDuration="41.251668649s" podCreationTimestamp="2026-01-22 14:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:23.772744382 +0000 UTC m=+252.474644565" watchObservedRunningTime="2026-01-22 14:08:43.251668649 +0000 UTC m=+271.953568842"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.255229 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tk862","openshift-marketplace/marketplace-operator-79b997595-mg4vw","openshift-authentication/oauth-openshift-558db77b4-zxs7c","openshift-marketplace/redhat-marketplace-nkd9x","openshift-marketplace/certified-operators-drqnv","openshift-kube-apiserver/kube-apiserver-crc"]
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.255309 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-798f497965-98ltg","openshift-kube-apiserver/kube-apiserver-crc"]
Jan 22 14:08:43 crc kubenswrapper[4801]: E0122 14:08:43.255565 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbee368a-1c06-459f-b5dc-44269645b020" containerName="installer"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.255600 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbee368a-1c06-459f-b5dc-44269645b020" containerName="installer"
Jan 22 14:08:43 crc kubenswrapper[4801]: E0122 14:08:43.255621 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2c98a4-f3a8-4200-8432-6f68459320ca" containerName="oauth-openshift"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.255630 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2c98a4-f3a8-4200-8432-6f68459320ca" containerName="oauth-openshift"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.255750 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbee368a-1c06-459f-b5dc-44269645b020" containerName="installer"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.255768 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2c98a4-f3a8-4200-8432-6f68459320ca" containerName="oauth-openshift"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.256077 4801 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.256116 4801 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bc7331d1-1b1f-44b3-b4be-83fd708d9c30"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.256144 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.258127 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.258287 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.258369 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.263103 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.263333 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.263481 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.263635 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.265648 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.265924 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.267397 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.267606 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.273677 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.280796 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.280863 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.282041 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.289506 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.295876 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.295857522 podStartE2EDuration="20.295857522s" podCreationTimestamp="2026-01-22 14:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:43.291402874 +0000 UTC m=+271.993303077" watchObservedRunningTime="2026-01-22 14:08:43.295857522 +0000 UTC m=+271.997757705"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.298429 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.301717 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.348405 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.410489 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.410529 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-login\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.410562 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-session\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.410584 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411221 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qnkv\" (UniqueName: \"kubernetes.io/projected/ce802d78-24c3-402e-977e-d272a2b0ea28-kube-api-access-8qnkv\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411329 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-router-certs\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411413 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411519 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-error\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411614 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411699 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-service-ca\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411789 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-audit-policies\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411893 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce802d78-24c3-402e-977e-d272a2b0ea28-audit-dir\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.411970 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.412070 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.436944 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.455828 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.513852 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-session\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.514235 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.514397 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qnkv\" (UniqueName: \"kubernetes.io/projected/ce802d78-24c3-402e-977e-d272a2b0ea28-kube-api-access-8qnkv\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.514570 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-router-certs\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.514806 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.514987 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-error\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.515157 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.515323 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-service-ca\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.515519 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-audit-policies\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.515732 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce802d78-24c3-402e-977e-d272a2b0ea28-audit-dir\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.515886 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.516024 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.516173 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.516322 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-login\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.516351 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce802d78-24c3-402e-977e-d272a2b0ea28-audit-dir\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.516860 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-service-ca\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.515178 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.517404 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-audit-policies\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.518404 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.521234 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.521512 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-router-certs\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.521568 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-session\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.521630 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-login\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.521785 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.521942 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-error\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.523181 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.529154 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ce802d78-24c3-402e-977e-d272a2b0ea28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.530638 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.532251 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qnkv\" (UniqueName: \"kubernetes.io/projected/ce802d78-24c3-402e-977e-d272a2b0ea28-kube-api-access-8qnkv\") pod \"oauth-openshift-798f497965-98ltg\" (UID: \"ce802d78-24c3-402e-977e-d272a2b0ea28\") " pod="openshift-authentication/oauth-openshift-798f497965-98ltg"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.582596 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="134f3572-e2f4-4817-93c5-e93c68b761de" path="/var/lib/kubelet/pods/134f3572-e2f4-4817-93c5-e93c68b761de/volumes"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.584089 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401b3157-b40f-489c-827f-d4f941e96001" path="/var/lib/kubelet/pods/401b3157-b40f-489c-827f-d4f941e96001/volumes"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.584860 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492ffad8-460b-46f1-b566-9e1cce5cbcb0" path="/var/lib/kubelet/pods/492ffad8-460b-46f1-b566-9e1cce5cbcb0/volumes"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.586619 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d2c98a4-f3a8-4200-8432-6f68459320ca" path="/var/lib/kubelet/pods/7d2c98a4-f3a8-4200-8432-6f68459320ca/volumes"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.588365 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3c0e1d-7cb5-400b-8c40-10fe54d57224" path="/var/lib/kubelet/pods/ca3c0e1d-7cb5-400b-8c40-10fe54d57224/volumes"
Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.589474 4801 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-798f497965-98ltg" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.670109 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.764767 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-798f497965-98ltg"] Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.790644 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.861299 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.874046 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.884476 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.889203 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.894300 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 22 14:08:43 crc kubenswrapper[4801]: I0122 14:08:43.926548 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.079327 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-798f497965-98ltg" 
event={"ID":"ce802d78-24c3-402e-977e-d272a2b0ea28","Type":"ContainerStarted","Data":"b4d9019ed58853b61493b2d0821023d146d7600b0d6680e28c7548974ace39d0"} Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.079377 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-798f497965-98ltg" event={"ID":"ce802d78-24c3-402e-977e-d272a2b0ea28","Type":"ContainerStarted","Data":"5ba0eaebe2046bd498ea719dbc1ff847761c89e77446e04435b007e8921fe33a"} Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.097766 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-798f497965-98ltg" podStartSLOduration=45.097743178 podStartE2EDuration="45.097743178s" podCreationTimestamp="2026-01-22 14:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:08:44.096876203 +0000 UTC m=+272.798776396" watchObservedRunningTime="2026-01-22 14:08:44.097743178 +0000 UTC m=+272.799643361" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.267990 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.299771 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.310492 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.317602 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.349518 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 22 14:08:44 crc kubenswrapper[4801]: 
I0122 14:08:44.572568 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.585606 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.588751 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.650052 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.662470 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.760708 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.784481 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.835119 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 22 14:08:44 crc kubenswrapper[4801]: I0122 14:08:44.904366 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.084180 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-798f497965-98ltg" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.089343 4801 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-798f497965-98ltg" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.127328 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.131722 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.217910 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.342319 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.500512 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.550837 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.595568 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.649044 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.699233 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.702368 4801 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.780423 4801 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.817950 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.874086 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.947820 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 22 14:08:45 crc kubenswrapper[4801]: I0122 14:08:45.953379 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.067879 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.344084 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.414229 4801 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.414441 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45" gracePeriod=5 Jan 22 
14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.522851 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.523212 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.606532 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.707533 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.709842 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.722853 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.737339 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.749176 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.759246 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.889289 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.955004 4801 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 14:08:46 crc kubenswrapper[4801]: I0122 14:08:46.983614 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 22 14:08:47 crc kubenswrapper[4801]: I0122 14:08:47.063481 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 22 14:08:47 crc kubenswrapper[4801]: I0122 14:08:47.101637 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 22 14:08:47 crc kubenswrapper[4801]: I0122 14:08:47.171427 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 22 14:08:47 crc kubenswrapper[4801]: I0122 14:08:47.186786 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 22 14:08:47 crc kubenswrapper[4801]: I0122 14:08:47.509347 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 22 14:08:47 crc kubenswrapper[4801]: I0122 14:08:47.947375 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 14:08:47 crc kubenswrapper[4801]: I0122 14:08:47.960814 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.012769 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.129029 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 
14:08:48.138825 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.207316 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.229076 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.233167 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.364016 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.402470 4801 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.478991 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.493795 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.498784 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.543095 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.670437 4801 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 22 14:08:48 crc kubenswrapper[4801]: I0122 14:08:48.700976 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 22 14:08:49 crc kubenswrapper[4801]: I0122 14:08:49.054462 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 22 14:08:49 crc kubenswrapper[4801]: I0122 14:08:49.081693 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 22 14:08:49 crc kubenswrapper[4801]: I0122 14:08:49.384132 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 22 14:08:49 crc kubenswrapper[4801]: I0122 14:08:49.496628 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 22 14:08:49 crc kubenswrapper[4801]: I0122 14:08:49.739810 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 22 14:08:49 crc kubenswrapper[4801]: I0122 14:08:49.775128 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 22 14:08:50 crc kubenswrapper[4801]: I0122 14:08:50.403924 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 22 14:08:51 crc kubenswrapper[4801]: I0122 14:08:51.334346 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:51.999964 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 
14:08:52.000037 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115214 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115266 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115319 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115354 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115403 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115430 4801 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115690 4801 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115741 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115782 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.115786 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.120957 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.121407 4801 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45" exitCode=137 Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.121505 4801 scope.go:117] "RemoveContainer" containerID="1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.121722 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.135896 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.169228 4801 scope.go:117] "RemoveContainer" containerID="1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45" Jan 22 14:08:52 crc kubenswrapper[4801]: E0122 14:08:52.169972 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45\": container with ID starting with 1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45 not found: ID does not exist" containerID="1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.170031 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45"} err="failed to get container status \"1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45\": rpc error: code = NotFound desc = could not find container \"1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45\": container with ID starting with 1ecfb44be15acb1043003040b4344ed7749bf2935e23054141b0613586868b45 not found: ID does not exist" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.217591 4801 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.217639 4801 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.217656 4801 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:52 crc kubenswrapper[4801]: I0122 14:08:52.217667 4801 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 22 14:08:53 crc kubenswrapper[4801]: I0122 14:08:53.577743 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 22 14:08:53 crc kubenswrapper[4801]: I0122 14:08:53.578093 4801 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 22 14:08:53 crc kubenswrapper[4801]: I0122 14:08:53.588111 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 14:08:53 crc kubenswrapper[4801]: I0122 14:08:53.588146 4801 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4e4bd214-0ae7-41ff-81cb-85d9c0fc9460" Jan 22 14:08:53 crc kubenswrapper[4801]: I0122 14:08:53.592683 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 14:08:53 crc kubenswrapper[4801]: I0122 14:08:53.592719 4801 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4e4bd214-0ae7-41ff-81cb-85d9c0fc9460" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.239260 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6ddc9"] Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.239977 4801 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" podUID="32fa36ec-07ad-4ebe-8e79-3692f621cd37" containerName="controller-manager" containerID="cri-o://956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7" gracePeriod=30 Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.310197 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"] Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.310524 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" podUID="5f171ae3-24fe-42e0-b8dd-0e733fc33381" containerName="route-controller-manager" containerID="cri-o://deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0" gracePeriod=30 Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.610960 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.667196 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.795949 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b94qp\" (UniqueName: \"kubernetes.io/projected/32fa36ec-07ad-4ebe-8e79-3692f621cd37-kube-api-access-b94qp\") pod \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.795996 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsms4\" (UniqueName: \"kubernetes.io/projected/5f171ae3-24fe-42e0-b8dd-0e733fc33381-kube-api-access-dsms4\") pod \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796025 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-proxy-ca-bundles\") pod \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796051 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fa36ec-07ad-4ebe-8e79-3692f621cd37-serving-cert\") pod \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796069 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-config\") pod \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796095 4801 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-config\") pod \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796115 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f171ae3-24fe-42e0-b8dd-0e733fc33381-serving-cert\") pod \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796161 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-client-ca\") pod \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\" (UID: \"32fa36ec-07ad-4ebe-8e79-3692f621cd37\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796543 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "32fa36ec-07ad-4ebe-8e79-3692f621cd37" (UID: "32fa36ec-07ad-4ebe-8e79-3692f621cd37"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796816 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-client-ca" (OuterVolumeSpecName: "client-ca") pod "32fa36ec-07ad-4ebe-8e79-3692f621cd37" (UID: "32fa36ec-07ad-4ebe-8e79-3692f621cd37"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.796903 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-config" (OuterVolumeSpecName: "config") pod "32fa36ec-07ad-4ebe-8e79-3692f621cd37" (UID: "32fa36ec-07ad-4ebe-8e79-3692f621cd37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.797143 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-config" (OuterVolumeSpecName: "config") pod "5f171ae3-24fe-42e0-b8dd-0e733fc33381" (UID: "5f171ae3-24fe-42e0-b8dd-0e733fc33381"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.797210 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-client-ca\") pod \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\" (UID: \"5f171ae3-24fe-42e0-b8dd-0e733fc33381\") " Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.797466 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.797478 4801 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.797489 4801 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.797499 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32fa36ec-07ad-4ebe-8e79-3692f621cd37-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.797985 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-client-ca" (OuterVolumeSpecName: "client-ca") pod "5f171ae3-24fe-42e0-b8dd-0e733fc33381" (UID: "5f171ae3-24fe-42e0-b8dd-0e733fc33381"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.801427 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f171ae3-24fe-42e0-b8dd-0e733fc33381-kube-api-access-dsms4" (OuterVolumeSpecName: "kube-api-access-dsms4") pod "5f171ae3-24fe-42e0-b8dd-0e733fc33381" (UID: "5f171ae3-24fe-42e0-b8dd-0e733fc33381"). InnerVolumeSpecName "kube-api-access-dsms4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.802026 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fa36ec-07ad-4ebe-8e79-3692f621cd37-kube-api-access-b94qp" (OuterVolumeSpecName: "kube-api-access-b94qp") pod "32fa36ec-07ad-4ebe-8e79-3692f621cd37" (UID: "32fa36ec-07ad-4ebe-8e79-3692f621cd37"). InnerVolumeSpecName "kube-api-access-b94qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.802141 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f171ae3-24fe-42e0-b8dd-0e733fc33381-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5f171ae3-24fe-42e0-b8dd-0e733fc33381" (UID: "5f171ae3-24fe-42e0-b8dd-0e733fc33381"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.802196 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fa36ec-07ad-4ebe-8e79-3692f621cd37-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32fa36ec-07ad-4ebe-8e79-3692f621cd37" (UID: "32fa36ec-07ad-4ebe-8e79-3692f621cd37"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.898848 4801 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f171ae3-24fe-42e0-b8dd-0e733fc33381-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.898905 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsms4\" (UniqueName: \"kubernetes.io/projected/5f171ae3-24fe-42e0-b8dd-0e733fc33381-kube-api-access-dsms4\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.898927 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b94qp\" (UniqueName: \"kubernetes.io/projected/32fa36ec-07ad-4ebe-8e79-3692f621cd37-kube-api-access-b94qp\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.898947 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32fa36ec-07ad-4ebe-8e79-3692f621cd37-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:06 crc kubenswrapper[4801]: I0122 14:09:06.898964 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f171ae3-24fe-42e0-b8dd-0e733fc33381-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.211783 4801 generic.go:334] "Generic (PLEG): container finished" 
podID="5f171ae3-24fe-42e0-b8dd-0e733fc33381" containerID="deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0" exitCode=0 Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.211839 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.211875 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" event={"ID":"5f171ae3-24fe-42e0-b8dd-0e733fc33381","Type":"ContainerDied","Data":"deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0"} Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.212920 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz" event={"ID":"5f171ae3-24fe-42e0-b8dd-0e733fc33381","Type":"ContainerDied","Data":"76a71ae39948c7835c8292e766aae647ee0edb8b681e9e3e5140e03fe8910381"} Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.212987 4801 scope.go:117] "RemoveContainer" containerID="deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.214106 4801 generic.go:334] "Generic (PLEG): container finished" podID="32fa36ec-07ad-4ebe-8e79-3692f621cd37" containerID="956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7" exitCode=0 Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.214141 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.214156 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" event={"ID":"32fa36ec-07ad-4ebe-8e79-3692f621cd37","Type":"ContainerDied","Data":"956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7"} Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.214194 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6ddc9" event={"ID":"32fa36ec-07ad-4ebe-8e79-3692f621cd37","Type":"ContainerDied","Data":"3515190969f39c2495f6c9f06f4240129ab697aa8e14235afdf461c125e17b62"} Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.237029 4801 scope.go:117] "RemoveContainer" containerID="deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0" Jan 22 14:09:07 crc kubenswrapper[4801]: E0122 14:09:07.237665 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0\": container with ID starting with deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0 not found: ID does not exist" containerID="deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.237728 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0"} err="failed to get container status \"deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0\": rpc error: code = NotFound desc = could not find container \"deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0\": container with ID starting with deec2148fa6f7e17cbf50e1da254170212605163f70992dd0785dfc1491c85a0 not found: ID does not 
exist" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.237768 4801 scope.go:117] "RemoveContainer" containerID="956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.249933 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"] Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.254349 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m4plz"] Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.257630 4801 scope.go:117] "RemoveContainer" containerID="956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7" Jan 22 14:09:07 crc kubenswrapper[4801]: E0122 14:09:07.261613 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7\": container with ID starting with 956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7 not found: ID does not exist" containerID="956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.261661 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7"} err="failed to get container status \"956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7\": rpc error: code = NotFound desc = could not find container \"956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7\": container with ID starting with 956d1512054ebda995942752d42990245a4d5d00754af581857b9c34e9a5acb7 not found: ID does not exist" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.262622 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-6ddc9"] Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.265202 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6ddc9"] Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447234 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b85456b8b-lwmv4"] Jan 22 14:09:07 crc kubenswrapper[4801]: E0122 14:09:07.447413 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f171ae3-24fe-42e0-b8dd-0e733fc33381" containerName="route-controller-manager" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447425 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f171ae3-24fe-42e0-b8dd-0e733fc33381" containerName="route-controller-manager" Jan 22 14:09:07 crc kubenswrapper[4801]: E0122 14:09:07.447439 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32fa36ec-07ad-4ebe-8e79-3692f621cd37" containerName="controller-manager" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447460 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fa36ec-07ad-4ebe-8e79-3692f621cd37" containerName="controller-manager" Jan 22 14:09:07 crc kubenswrapper[4801]: E0122 14:09:07.447476 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447484 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447565 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447577 4801 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5f171ae3-24fe-42e0-b8dd-0e733fc33381" containerName="route-controller-manager" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447586 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fa36ec-07ad-4ebe-8e79-3692f621cd37" containerName="controller-manager" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.447898 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.450013 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.450365 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.450629 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.451078 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.451586 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.456691 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.464129 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.464274 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b85456b8b-lwmv4"] Jan 22 
14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.577958 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fa36ec-07ad-4ebe-8e79-3692f621cd37" path="/var/lib/kubelet/pods/32fa36ec-07ad-4ebe-8e79-3692f621cd37/volumes" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.578670 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f171ae3-24fe-42e0-b8dd-0e733fc33381" path="/var/lib/kubelet/pods/5f171ae3-24fe-42e0-b8dd-0e733fc33381/volumes" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.607924 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-proxy-ca-bundles\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.607990 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbn57\" (UniqueName: \"kubernetes.io/projected/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-kube-api-access-dbn57\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.608020 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-serving-cert\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.608059 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-client-ca\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.608140 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-config\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.709991 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-proxy-ca-bundles\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.710117 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbn57\" (UniqueName: \"kubernetes.io/projected/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-kube-api-access-dbn57\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.710172 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-serving-cert\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc 
kubenswrapper[4801]: I0122 14:09:07.710259 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-client-ca\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.710398 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-config\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.711396 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-client-ca\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.713165 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-config\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.713251 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-proxy-ca-bundles\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " 
pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.723753 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-serving-cert\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.738878 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbn57\" (UniqueName: \"kubernetes.io/projected/7fd3e58d-34b1-45c9-8b37-7bbbcd343dca-kube-api-access-dbn57\") pod \"controller-manager-5b85456b8b-lwmv4\" (UID: \"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca\") " pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.774085 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:07 crc kubenswrapper[4801]: I0122 14:09:07.990607 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b85456b8b-lwmv4"] Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.222746 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" event={"ID":"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca","Type":"ContainerStarted","Data":"7bdb8979c21f78b35c6247f6f37292120fdfd30a305e4a5a8a22ace8b59fb19a"} Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.223106 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.223117 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" event={"ID":"7fd3e58d-34b1-45c9-8b37-7bbbcd343dca","Type":"ContainerStarted","Data":"5627784d98b7d20b83693c9f91b14e663b1c3ebcaf9e1cdc24a02d1d2e829c3c"} Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.230338 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.248009 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b85456b8b-lwmv4" podStartSLOduration=2.247985887 podStartE2EDuration="2.247985887s" podCreationTimestamp="2026-01-22 14:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:09:08.244570128 +0000 UTC m=+296.946470311" watchObservedRunningTime="2026-01-22 14:09:08.247985887 +0000 UTC m=+296.949886070" Jan 22 14:09:08 crc 
kubenswrapper[4801]: I0122 14:09:08.449013 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"] Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.449711 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.451748 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.452053 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.452102 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.452265 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.454137 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.456158 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.464392 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"] Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.531637 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-client-ca\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.531713 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c5beaf-0746-4210-a839-a72db2ab8d80-serving-cert\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.531904 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-config\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.531997 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5657\" (UniqueName: \"kubernetes.io/projected/d0c5beaf-0746-4210-a839-a72db2ab8d80-kube-api-access-q5657\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.632528 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5657\" (UniqueName: \"kubernetes.io/projected/d0c5beaf-0746-4210-a839-a72db2ab8d80-kube-api-access-q5657\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " 
pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.632651 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-client-ca\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.632707 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c5beaf-0746-4210-a839-a72db2ab8d80-serving-cert\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.632757 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-config\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.634019 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-client-ca\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.634192 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-config\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.642521 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c5beaf-0746-4210-a839-a72db2ab8d80-serving-cert\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.654846 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5657\" (UniqueName: \"kubernetes.io/projected/d0c5beaf-0746-4210-a839-a72db2ab8d80-kube-api-access-q5657\") pod \"route-controller-manager-778d455ff-k67xv\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") " pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:08 crc kubenswrapper[4801]: I0122 14:09:08.764177 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.118326 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mcrjs"] Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.119475 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.123551 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.142854 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcrjs"] Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.187174 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"] Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.231900 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" event={"ID":"d0c5beaf-0746-4210-a839-a72db2ab8d80","Type":"ContainerStarted","Data":"c987369d990ec2b2e6ba6d489d89a501851316cfb218f267a609d06c0f32f1e2"} Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.239870 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caa6ed3a-95d8-4d88-9970-b2545b4c5803-utilities\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.240130 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj27f\" (UniqueName: \"kubernetes.io/projected/caa6ed3a-95d8-4d88-9970-b2545b4c5803-kube-api-access-qj27f\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.240221 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/caa6ed3a-95d8-4d88-9970-b2545b4c5803-catalog-content\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.341144 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj27f\" (UniqueName: \"kubernetes.io/projected/caa6ed3a-95d8-4d88-9970-b2545b4c5803-kube-api-access-qj27f\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.341211 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caa6ed3a-95d8-4d88-9970-b2545b4c5803-catalog-content\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.341329 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caa6ed3a-95d8-4d88-9970-b2545b4c5803-utilities\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.341805 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caa6ed3a-95d8-4d88-9970-b2545b4c5803-catalog-content\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.341883 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/caa6ed3a-95d8-4d88-9970-b2545b4c5803-utilities\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.361047 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj27f\" (UniqueName: \"kubernetes.io/projected/caa6ed3a-95d8-4d88-9970-b2545b4c5803-kube-api-access-qj27f\") pod \"redhat-operators-mcrjs\" (UID: \"caa6ed3a-95d8-4d88-9970-b2545b4c5803\") " pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.442196 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.514114 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jnccj"] Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.515268 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.517015 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.534945 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnccj"] Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.543637 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-catalog-content\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.543702 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-utilities\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.543853 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvdxq\" (UniqueName: \"kubernetes.io/projected/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-kube-api-access-rvdxq\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.643101 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcrjs"] Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.644124 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rvdxq\" (UniqueName: \"kubernetes.io/projected/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-kube-api-access-rvdxq\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.644189 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-catalog-content\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.644226 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-utilities\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.645502 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-utilities\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.645582 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-catalog-content\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.669344 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvdxq\" (UniqueName: 
\"kubernetes.io/projected/1cbdd90e-f455-4394-9547-a9d49c8f7ffe-kube-api-access-rvdxq\") pod \"redhat-marketplace-jnccj\" (UID: \"1cbdd90e-f455-4394-9547-a9d49c8f7ffe\") " pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:09 crc kubenswrapper[4801]: I0122 14:09:09.848790 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.039434 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnccj"] Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.254575 4801 generic.go:334] "Generic (PLEG): container finished" podID="1cbdd90e-f455-4394-9547-a9d49c8f7ffe" containerID="841f2ba30deff8969e4dffbe53ce80a17dff4e38f37b722070158f446ed406de" exitCode=0 Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.254773 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnccj" event={"ID":"1cbdd90e-f455-4394-9547-a9d49c8f7ffe","Type":"ContainerDied","Data":"841f2ba30deff8969e4dffbe53ce80a17dff4e38f37b722070158f446ed406de"} Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.254873 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnccj" event={"ID":"1cbdd90e-f455-4394-9547-a9d49c8f7ffe","Type":"ContainerStarted","Data":"c741d2ad4e5830adf6a78aea75f3ebd76a975218edf91acc65a2b6ee349ee8d5"} Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.257937 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" event={"ID":"d0c5beaf-0746-4210-a839-a72db2ab8d80","Type":"ContainerStarted","Data":"57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f"} Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.258215 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.264402 4801 generic.go:334] "Generic (PLEG): container finished" podID="caa6ed3a-95d8-4d88-9970-b2545b4c5803" containerID="a73cc7129ad5b838cc6fb787d2a93e9d1d7952ec49c06b566251df292c931cbd" exitCode=0 Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.264509 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcrjs" event={"ID":"caa6ed3a-95d8-4d88-9970-b2545b4c5803","Type":"ContainerDied","Data":"a73cc7129ad5b838cc6fb787d2a93e9d1d7952ec49c06b566251df292c931cbd"} Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.264535 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcrjs" event={"ID":"caa6ed3a-95d8-4d88-9970-b2545b4c5803","Type":"ContainerStarted","Data":"22e4804c1e2325a5edc3683ad9add2d3f371ada0f9511fd6234992aeb769dc18"} Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.274040 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" Jan 22 14:09:10 crc kubenswrapper[4801]: I0122 14:09:10.313262 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" podStartSLOduration=4.313245112 podStartE2EDuration="4.313245112s" podCreationTimestamp="2026-01-22 14:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:09:10.310972526 +0000 UTC m=+299.012872709" watchObservedRunningTime="2026-01-22 14:09:10.313245112 +0000 UTC m=+299.015145295" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.272173 4801 generic.go:334] "Generic (PLEG): container finished" podID="1cbdd90e-f455-4394-9547-a9d49c8f7ffe" 
containerID="448c7e25d460d3a82a29280a85df9278a2c565214614a10082df75fad56e71c6" exitCode=0 Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.272270 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnccj" event={"ID":"1cbdd90e-f455-4394-9547-a9d49c8f7ffe","Type":"ContainerDied","Data":"448c7e25d460d3a82a29280a85df9278a2c565214614a10082df75fad56e71c6"} Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.435392 4801 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.712172 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gbz4b"] Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.715849 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.719063 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.730691 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gbz4b"] Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.773318 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b8f0ee-b547-4b8b-9bac-9803fee56dec-utilities\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.773568 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/50b8f0ee-b547-4b8b-9bac-9803fee56dec-catalog-content\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.773763 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwf5\" (UniqueName: \"kubernetes.io/projected/50b8f0ee-b547-4b8b-9bac-9803fee56dec-kube-api-access-czwf5\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.875845 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b8f0ee-b547-4b8b-9bac-9803fee56dec-utilities\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.875930 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50b8f0ee-b547-4b8b-9bac-9803fee56dec-catalog-content\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.876160 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czwf5\" (UniqueName: \"kubernetes.io/projected/50b8f0ee-b547-4b8b-9bac-9803fee56dec-kube-api-access-czwf5\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.876647 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/50b8f0ee-b547-4b8b-9bac-9803fee56dec-catalog-content\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.876725 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50b8f0ee-b547-4b8b-9bac-9803fee56dec-utilities\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:11 crc kubenswrapper[4801]: I0122 14:09:11.904150 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwf5\" (UniqueName: \"kubernetes.io/projected/50b8f0ee-b547-4b8b-9bac-9803fee56dec-kube-api-access-czwf5\") pod \"certified-operators-gbz4b\" (UID: \"50b8f0ee-b547-4b8b-9bac-9803fee56dec\") " pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:12 crc kubenswrapper[4801]: I0122 14:09:12.083643 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:12 crc kubenswrapper[4801]: I0122 14:09:12.284649 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnccj" event={"ID":"1cbdd90e-f455-4394-9547-a9d49c8f7ffe","Type":"ContainerStarted","Data":"3b435b91ce8f24ad562e4bdab673e71652cee4be8f3073f4a977cfe34c5818f6"} Jan 22 14:09:12 crc kubenswrapper[4801]: I0122 14:09:12.288398 4801 generic.go:334] "Generic (PLEG): container finished" podID="caa6ed3a-95d8-4d88-9970-b2545b4c5803" containerID="8f22a54988a45dd11e275f27759073b5eb9c910181a19e97a3cf4eb097f2c3bb" exitCode=0 Jan 22 14:09:12 crc kubenswrapper[4801]: I0122 14:09:12.288693 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcrjs" event={"ID":"caa6ed3a-95d8-4d88-9970-b2545b4c5803","Type":"ContainerDied","Data":"8f22a54988a45dd11e275f27759073b5eb9c910181a19e97a3cf4eb097f2c3bb"} Jan 22 14:09:12 crc kubenswrapper[4801]: I0122 14:09:12.324228 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jnccj" podStartSLOduration=1.866697586 podStartE2EDuration="3.324208222s" podCreationTimestamp="2026-01-22 14:09:09 +0000 UTC" firstStartedPulling="2026-01-22 14:09:10.25627683 +0000 UTC m=+298.958177013" lastFinishedPulling="2026-01-22 14:09:11.713787466 +0000 UTC m=+300.415687649" observedRunningTime="2026-01-22 14:09:12.305271087 +0000 UTC m=+301.007171290" watchObservedRunningTime="2026-01-22 14:09:12.324208222 +0000 UTC m=+301.026108405" Jan 22 14:09:12 crc kubenswrapper[4801]: I0122 14:09:12.502384 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gbz4b"] Jan 22 14:09:12 crc kubenswrapper[4801]: W0122 14:09:12.512748 4801 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b8f0ee_b547_4b8b_9bac_9803fee56dec.slice/crio-59830ed7c91b564c7574baf99fa71f0fc2d8050c7b79dedfdc0252501a89a24d WatchSource:0}: Error finding container 59830ed7c91b564c7574baf99fa71f0fc2d8050c7b79dedfdc0252501a89a24d: Status 404 returned error can't find the container with id 59830ed7c91b564c7574baf99fa71f0fc2d8050c7b79dedfdc0252501a89a24d Jan 22 14:09:13 crc kubenswrapper[4801]: I0122 14:09:13.295334 4801 generic.go:334] "Generic (PLEG): container finished" podID="50b8f0ee-b547-4b8b-9bac-9803fee56dec" containerID="205929ca28ab8f85d2270eeda4e59ede8f429e88fe5b4c4117d7731b660a782d" exitCode=0 Jan 22 14:09:13 crc kubenswrapper[4801]: I0122 14:09:13.295729 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbz4b" event={"ID":"50b8f0ee-b547-4b8b-9bac-9803fee56dec","Type":"ContainerDied","Data":"205929ca28ab8f85d2270eeda4e59ede8f429e88fe5b4c4117d7731b660a782d"} Jan 22 14:09:13 crc kubenswrapper[4801]: I0122 14:09:13.295777 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbz4b" event={"ID":"50b8f0ee-b547-4b8b-9bac-9803fee56dec","Type":"ContainerStarted","Data":"59830ed7c91b564c7574baf99fa71f0fc2d8050c7b79dedfdc0252501a89a24d"} Jan 22 14:09:13 crc kubenswrapper[4801]: I0122 14:09:13.298011 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcrjs" event={"ID":"caa6ed3a-95d8-4d88-9970-b2545b4c5803","Type":"ContainerStarted","Data":"66bb861d4a558b91395cb510bfb984fb9d138ccea6c11110526cac5b7ae26fa6"} Jan 22 14:09:13 crc kubenswrapper[4801]: I0122 14:09:13.331331 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mcrjs" podStartSLOduration=1.812187765 podStartE2EDuration="4.331315974s" podCreationTimestamp="2026-01-22 14:09:09 +0000 UTC" firstStartedPulling="2026-01-22 14:09:10.268031088 +0000 
UTC m=+298.969931271" lastFinishedPulling="2026-01-22 14:09:12.787159297 +0000 UTC m=+301.489059480" observedRunningTime="2026-01-22 14:09:13.330324215 +0000 UTC m=+302.032224398" watchObservedRunningTime="2026-01-22 14:09:13.331315974 +0000 UTC m=+302.033216157" Jan 22 14:09:14 crc kubenswrapper[4801]: I0122 14:09:14.305251 4801 generic.go:334] "Generic (PLEG): container finished" podID="50b8f0ee-b547-4b8b-9bac-9803fee56dec" containerID="dcfc42bb6eb3a2fa5302f7b015894eca1c3965e95ffbbf68bc69fe409d0c93e4" exitCode=0 Jan 22 14:09:14 crc kubenswrapper[4801]: I0122 14:09:14.305304 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbz4b" event={"ID":"50b8f0ee-b547-4b8b-9bac-9803fee56dec","Type":"ContainerDied","Data":"dcfc42bb6eb3a2fa5302f7b015894eca1c3965e95ffbbf68bc69fe409d0c93e4"} Jan 22 14:09:15 crc kubenswrapper[4801]: I0122 14:09:15.313582 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gbz4b" event={"ID":"50b8f0ee-b547-4b8b-9bac-9803fee56dec","Type":"ContainerStarted","Data":"0a3830a96d41a3d3855f39dad6fcec631c5d2c4f569934221494bf6dfe04eb4f"} Jan 22 14:09:15 crc kubenswrapper[4801]: I0122 14:09:15.341593 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gbz4b" podStartSLOduration=2.924258047 podStartE2EDuration="4.341576074s" podCreationTimestamp="2026-01-22 14:09:11 +0000 UTC" firstStartedPulling="2026-01-22 14:09:13.297371855 +0000 UTC m=+301.999272038" lastFinishedPulling="2026-01-22 14:09:14.714689882 +0000 UTC m=+303.416590065" observedRunningTime="2026-01-22 14:09:15.340428261 +0000 UTC m=+304.042328484" watchObservedRunningTime="2026-01-22 14:09:15.341576074 +0000 UTC m=+304.043476257" Jan 22 14:09:19 crc kubenswrapper[4801]: I0122 14:09:19.442705 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:19 crc 
kubenswrapper[4801]: I0122 14:09:19.443204 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:19 crc kubenswrapper[4801]: I0122 14:09:19.494848 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:19 crc kubenswrapper[4801]: I0122 14:09:19.849896 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:19 crc kubenswrapper[4801]: I0122 14:09:19.850253 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:19 crc kubenswrapper[4801]: I0122 14:09:19.890981 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:20 crc kubenswrapper[4801]: I0122 14:09:20.393314 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mcrjs" Jan 22 14:09:20 crc kubenswrapper[4801]: I0122 14:09:20.399165 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jnccj" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.084355 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.084512 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.239889 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.389267 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-gbz4b" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.508925 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w898g"] Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.510018 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w898g" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.511950 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.521098 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w898g"] Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.622527 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jznzb\" (UniqueName: \"kubernetes.io/projected/c97391e4-71b7-4569-b82e-a9cd32d9f439-kube-api-access-jznzb\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.622907 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97391e4-71b7-4569-b82e-a9cd32d9f439-catalog-content\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g" Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.623003 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97391e4-71b7-4569-b82e-a9cd32d9f439-utilities\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " 
pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.724381 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jznzb\" (UniqueName: \"kubernetes.io/projected/c97391e4-71b7-4569-b82e-a9cd32d9f439-kube-api-access-jznzb\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.724437 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97391e4-71b7-4569-b82e-a9cd32d9f439-catalog-content\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.724515 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97391e4-71b7-4569-b82e-a9cd32d9f439-utilities\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.724962 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97391e4-71b7-4569-b82e-a9cd32d9f439-catalog-content\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.725045 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97391e4-71b7-4569-b82e-a9cd32d9f439-utilities\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.744978 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jznzb\" (UniqueName: \"kubernetes.io/projected/c97391e4-71b7-4569-b82e-a9cd32d9f439-kube-api-access-jznzb\") pod \"community-operators-w898g\" (UID: \"c97391e4-71b7-4569-b82e-a9cd32d9f439\") " pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:22 crc kubenswrapper[4801]: I0122 14:09:22.829094 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:23 crc kubenswrapper[4801]: I0122 14:09:23.223786 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w898g"]
Jan 22 14:09:23 crc kubenswrapper[4801]: I0122 14:09:23.358103 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w898g" event={"ID":"c97391e4-71b7-4569-b82e-a9cd32d9f439","Type":"ContainerStarted","Data":"3f31d39fae823136a059db59c6a9db3f6450c059ad2295096e93a50452894180"}
Jan 22 14:09:26 crc kubenswrapper[4801]: I0122 14:09:26.250495 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"]
Jan 22 14:09:26 crc kubenswrapper[4801]: I0122 14:09:26.251166 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" podUID="d0c5beaf-0746-4210-a839-a72db2ab8d80" containerName="route-controller-manager" containerID="cri-o://57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f" gracePeriod=30
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.259218 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.289356 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"]
Jan 22 14:09:27 crc kubenswrapper[4801]: E0122 14:09:27.289858 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c5beaf-0746-4210-a839-a72db2ab8d80" containerName="route-controller-manager"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.289985 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c5beaf-0746-4210-a839-a72db2ab8d80" containerName="route-controller-manager"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.290200 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c5beaf-0746-4210-a839-a72db2ab8d80" containerName="route-controller-manager"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.290710 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.316058 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"]
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.383867 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-config\") pod \"d0c5beaf-0746-4210-a839-a72db2ab8d80\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") "
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.384257 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5657\" (UniqueName: \"kubernetes.io/projected/d0c5beaf-0746-4210-a839-a72db2ab8d80-kube-api-access-q5657\") pod \"d0c5beaf-0746-4210-a839-a72db2ab8d80\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") "
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.384379 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-client-ca\") pod \"d0c5beaf-0746-4210-a839-a72db2ab8d80\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") "
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.384536 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c5beaf-0746-4210-a839-a72db2ab8d80-serving-cert\") pod \"d0c5beaf-0746-4210-a839-a72db2ab8d80\" (UID: \"d0c5beaf-0746-4210-a839-a72db2ab8d80\") "
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.384880 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-config" (OuterVolumeSpecName: "config") pod "d0c5beaf-0746-4210-a839-a72db2ab8d80" (UID: "d0c5beaf-0746-4210-a839-a72db2ab8d80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.385329 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0c5beaf-0746-4210-a839-a72db2ab8d80" (UID: "d0c5beaf-0746-4210-a839-a72db2ab8d80"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.385370 4801 generic.go:334] "Generic (PLEG): container finished" podID="c97391e4-71b7-4569-b82e-a9cd32d9f439" containerID="46569951323fb35a44ff10b7189a4ad465dbdb8c2233220d8765fbd4674bcf48" exitCode=0
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.385538 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w898g" event={"ID":"c97391e4-71b7-4569-b82e-a9cd32d9f439","Type":"ContainerDied","Data":"46569951323fb35a44ff10b7189a4ad465dbdb8c2233220d8765fbd4674bcf48"}
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.389736 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c5beaf-0746-4210-a839-a72db2ab8d80-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0c5beaf-0746-4210-a839-a72db2ab8d80" (UID: "d0c5beaf-0746-4210-a839-a72db2ab8d80"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.390140 4801 generic.go:334] "Generic (PLEG): container finished" podID="d0c5beaf-0746-4210-a839-a72db2ab8d80" containerID="57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f" exitCode=0
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.390210 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" event={"ID":"d0c5beaf-0746-4210-a839-a72db2ab8d80","Type":"ContainerDied","Data":"57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f"}
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.390270 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv" event={"ID":"d0c5beaf-0746-4210-a839-a72db2ab8d80","Type":"ContainerDied","Data":"c987369d990ec2b2e6ba6d489d89a501851316cfb218f267a609d06c0f32f1e2"}
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.390294 4801 scope.go:117] "RemoveContainer" containerID="57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.390390 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c5beaf-0746-4210-a839-a72db2ab8d80-kube-api-access-q5657" (OuterVolumeSpecName: "kube-api-access-q5657") pod "d0c5beaf-0746-4210-a839-a72db2ab8d80" (UID: "d0c5beaf-0746-4210-a839-a72db2ab8d80"). InnerVolumeSpecName "kube-api-access-q5657". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.390401 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.421901 4801 scope.go:117] "RemoveContainer" containerID="57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f"
Jan 22 14:09:27 crc kubenswrapper[4801]: E0122 14:09:27.422903 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f\": container with ID starting with 57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f not found: ID does not exist" containerID="57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.422940 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f"} err="failed to get container status \"57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f\": rpc error: code = NotFound desc = could not find container \"57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f\": container with ID starting with 57b19e52537198fe7e972c5f063b4851f39df1e9597e92dfe46dbece0ecb019f not found: ID does not exist"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.430673 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"]
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.433735 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778d455ff-k67xv"]
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.486499 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff428140-fb3a-4a23-b84c-aa768ba99d20-serving-cert\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.486572 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzgd5\" (UniqueName: \"kubernetes.io/projected/ff428140-fb3a-4a23-b84c-aa768ba99d20-kube-api-access-wzgd5\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.486649 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff428140-fb3a-4a23-b84c-aa768ba99d20-config\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.486904 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff428140-fb3a-4a23-b84c-aa768ba99d20-client-ca\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.487073 4801 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c5beaf-0746-4210-a839-a72db2ab8d80-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.487099 4801 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-config\") on node \"crc\" DevicePath \"\""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.487118 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5657\" (UniqueName: \"kubernetes.io/projected/d0c5beaf-0746-4210-a839-a72db2ab8d80-kube-api-access-q5657\") on node \"crc\" DevicePath \"\""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.487136 4801 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0c5beaf-0746-4210-a839-a72db2ab8d80-client-ca\") on node \"crc\" DevicePath \"\""
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.581201 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c5beaf-0746-4210-a839-a72db2ab8d80" path="/var/lib/kubelet/pods/d0c5beaf-0746-4210-a839-a72db2ab8d80/volumes"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.588988 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff428140-fb3a-4a23-b84c-aa768ba99d20-client-ca\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.589073 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff428140-fb3a-4a23-b84c-aa768ba99d20-serving-cert\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.589098 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzgd5\" (UniqueName: \"kubernetes.io/projected/ff428140-fb3a-4a23-b84c-aa768ba99d20-kube-api-access-wzgd5\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.589126 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff428140-fb3a-4a23-b84c-aa768ba99d20-config\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.592935 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff428140-fb3a-4a23-b84c-aa768ba99d20-config\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.593086 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff428140-fb3a-4a23-b84c-aa768ba99d20-client-ca\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.597972 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff428140-fb3a-4a23-b84c-aa768ba99d20-serving-cert\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.607228 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzgd5\" (UniqueName: \"kubernetes.io/projected/ff428140-fb3a-4a23-b84c-aa768ba99d20-kube-api-access-wzgd5\") pod \"route-controller-manager-6db9c85b88-lp8dt\" (UID: \"ff428140-fb3a-4a23-b84c-aa768ba99d20\") " pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:27 crc kubenswrapper[4801]: I0122 14:09:27.617960 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:28 crc kubenswrapper[4801]: I0122 14:09:28.013943 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"]
Jan 22 14:09:28 crc kubenswrapper[4801]: W0122 14:09:28.021257 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff428140_fb3a_4a23_b84c_aa768ba99d20.slice/crio-b1619d6004dba82a76c48532c64b67b04617ac0b4b8c71e69c38c58d0dc8cab8 WatchSource:0}: Error finding container b1619d6004dba82a76c48532c64b67b04617ac0b4b8c71e69c38c58d0dc8cab8: Status 404 returned error can't find the container with id b1619d6004dba82a76c48532c64b67b04617ac0b4b8c71e69c38c58d0dc8cab8
Jan 22 14:09:28 crc kubenswrapper[4801]: I0122 14:09:28.400703 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt" event={"ID":"ff428140-fb3a-4a23-b84c-aa768ba99d20","Type":"ContainerStarted","Data":"53267738efd9a136f0b5a3c612e719ea3cd7dfde30b82fc140e23f094c635501"}
Jan 22 14:09:28 crc kubenswrapper[4801]: I0122 14:09:28.401336 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:28 crc kubenswrapper[4801]: I0122 14:09:28.401383 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt" event={"ID":"ff428140-fb3a-4a23-b84c-aa768ba99d20","Type":"ContainerStarted","Data":"b1619d6004dba82a76c48532c64b67b04617ac0b4b8c71e69c38c58d0dc8cab8"}
Jan 22 14:09:28 crc kubenswrapper[4801]: I0122 14:09:28.404198 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w898g" event={"ID":"c97391e4-71b7-4569-b82e-a9cd32d9f439","Type":"ContainerStarted","Data":"eb6e56efe876f6117bcfc32656de1420c69cdc1a4c2be3516a50aad158f3342e"}
Jan 22 14:09:28 crc kubenswrapper[4801]: I0122 14:09:28.427818 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt" podStartSLOduration=2.42779918 podStartE2EDuration="2.42779918s" podCreationTimestamp="2026-01-22 14:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:09:28.424432013 +0000 UTC m=+317.126332206" watchObservedRunningTime="2026-01-22 14:09:28.42779918 +0000 UTC m=+317.129699373"
Jan 22 14:09:28 crc kubenswrapper[4801]: I0122 14:09:28.855401 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6db9c85b88-lp8dt"
Jan 22 14:09:29 crc kubenswrapper[4801]: I0122 14:09:29.411825 4801 generic.go:334] "Generic (PLEG): container finished" podID="c97391e4-71b7-4569-b82e-a9cd32d9f439" containerID="eb6e56efe876f6117bcfc32656de1420c69cdc1a4c2be3516a50aad158f3342e" exitCode=0
Jan 22 14:09:29 crc kubenswrapper[4801]: I0122 14:09:29.411863 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w898g" event={"ID":"c97391e4-71b7-4569-b82e-a9cd32d9f439","Type":"ContainerDied","Data":"eb6e56efe876f6117bcfc32656de1420c69cdc1a4c2be3516a50aad158f3342e"}
Jan 22 14:09:30 crc kubenswrapper[4801]: I0122 14:09:30.418558 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w898g" event={"ID":"c97391e4-71b7-4569-b82e-a9cd32d9f439","Type":"ContainerStarted","Data":"33b5b3c34e63f9f9fc906b0520cb120aa2ea18a2a9dfad76f046d73075b85a6e"}
Jan 22 14:09:30 crc kubenswrapper[4801]: I0122 14:09:30.435381 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w898g" podStartSLOduration=5.719260676 podStartE2EDuration="8.435362793s" podCreationTimestamp="2026-01-22 14:09:22 +0000 UTC" firstStartedPulling="2026-01-22 14:09:27.386996077 +0000 UTC m=+316.088896260" lastFinishedPulling="2026-01-22 14:09:30.103098194 +0000 UTC m=+318.804998377" observedRunningTime="2026-01-22 14:09:30.434489837 +0000 UTC m=+319.136390020" watchObservedRunningTime="2026-01-22 14:09:30.435362793 +0000 UTC m=+319.137262976"
Jan 22 14:09:32 crc kubenswrapper[4801]: I0122 14:09:32.829416 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:32 crc kubenswrapper[4801]: I0122 14:09:32.829860 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:32 crc kubenswrapper[4801]: I0122 14:09:32.866714 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:34 crc kubenswrapper[4801]: I0122 14:09:34.021703 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:09:34 crc kubenswrapper[4801]: I0122 14:09:34.021775 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:09:42 crc kubenswrapper[4801]: I0122 14:09:42.876707 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w898g"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.331695 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4prlp"]
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.332910 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.383528 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4prlp"]
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.430833 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-registry-tls\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.430895 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qkvn\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-kube-api-access-6qkvn\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.430919 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c08a7a92-6f5b-4d69-8193-dae310a3cd52-registry-certificates\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.430968 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.431002 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c08a7a92-6f5b-4d69-8193-dae310a3cd52-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.431018 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-bound-sa-token\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.431046 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c08a7a92-6f5b-4d69-8193-dae310a3cd52-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.431168 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c08a7a92-6f5b-4d69-8193-dae310a3cd52-trusted-ca\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.465275 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.537133 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c08a7a92-6f5b-4d69-8193-dae310a3cd52-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.537204 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c08a7a92-6f5b-4d69-8193-dae310a3cd52-trusted-ca\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.537270 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-registry-tls\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.537297 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qkvn\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-kube-api-access-6qkvn\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.537321 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c08a7a92-6f5b-4d69-8193-dae310a3cd52-registry-certificates\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.537368 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c08a7a92-6f5b-4d69-8193-dae310a3cd52-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.537388 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-bound-sa-token\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.539215 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c08a7a92-6f5b-4d69-8193-dae310a3cd52-trusted-ca\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.540695 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c08a7a92-6f5b-4d69-8193-dae310a3cd52-registry-certificates\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.541340 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c08a7a92-6f5b-4d69-8193-dae310a3cd52-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.546870 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c08a7a92-6f5b-4d69-8193-dae310a3cd52-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.551703 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-registry-tls\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.561302 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qkvn\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-kube-api-access-6qkvn\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.561870 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c08a7a92-6f5b-4d69-8193-dae310a3cd52-bound-sa-token\") pod \"image-registry-66df7c8f76-4prlp\" (UID: \"c08a7a92-6f5b-4d69-8193-dae310a3cd52\") " pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:53 crc kubenswrapper[4801]: I0122 14:09:53.652595 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:54 crc kubenswrapper[4801]: I0122 14:09:54.106301 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4prlp"]
Jan 22 14:09:54 crc kubenswrapper[4801]: I0122 14:09:54.580700 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4prlp" event={"ID":"c08a7a92-6f5b-4d69-8193-dae310a3cd52","Type":"ContainerStarted","Data":"046b112778343d64f3e4f57b2744baedae5ee7896267af21faa45ede057d6929"}
Jan 22 14:09:54 crc kubenswrapper[4801]: I0122 14:09:54.580767 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4prlp" event={"ID":"c08a7a92-6f5b-4d69-8193-dae310a3cd52","Type":"ContainerStarted","Data":"d96636e38ac6fbbe187b71a2c112a8161defa1d3ebbf61dda0f81e3644a00b16"}
Jan 22 14:09:54 crc kubenswrapper[4801]: I0122 14:09:54.580864 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:09:54 crc kubenswrapper[4801]: I0122 14:09:54.606843 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4prlp" podStartSLOduration=1.606820439 podStartE2EDuration="1.606820439s" podCreationTimestamp="2026-01-22 14:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:09:54.604210984 +0000 UTC m=+343.306111207" watchObservedRunningTime="2026-01-22 14:09:54.606820439 +0000 UTC m=+343.308720652"
Jan 22 14:10:04 crc kubenswrapper[4801]: I0122 14:10:04.021984 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:10:04 crc kubenswrapper[4801]: I0122 14:10:04.022931 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:10:13 crc kubenswrapper[4801]: I0122 14:10:13.660288 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4prlp"
Jan 22 14:10:13 crc kubenswrapper[4801]: I0122 14:10:13.719989 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zckqh"]
Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.021802 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.022562 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.022634 4801 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp"
Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.023588 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon"
containerStatusID={"Type":"cri-o","ID":"05f9bc84ca95ce886a74f9a3709eccbccd35ca90eb20e5f1f29ed5cdd447357e"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.023715 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://05f9bc84ca95ce886a74f9a3709eccbccd35ca90eb20e5f1f29ed5cdd447357e" gracePeriod=600 Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.837334 4801 generic.go:334] "Generic (PLEG): container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="05f9bc84ca95ce886a74f9a3709eccbccd35ca90eb20e5f1f29ed5cdd447357e" exitCode=0 Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.837412 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"05f9bc84ca95ce886a74f9a3709eccbccd35ca90eb20e5f1f29ed5cdd447357e"} Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.837872 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"2947974c20c12cb11a19ecd994e635ad7fbf51a0a5e93f59ad73397d81838cba"} Jan 22 14:10:34 crc kubenswrapper[4801]: I0122 14:10:34.837892 4801 scope.go:117] "RemoveContainer" containerID="ba6c8cab7fa0c5f4000875a21da35b9bfd3d98e52e742a2bc6ee7fbc86993f82" Jan 22 14:10:38 crc kubenswrapper[4801]: I0122 14:10:38.775178 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" 
podUID="25f78c50-1bd7-4abd-b231-a932aa15f2af" containerName="registry" containerID="cri-o://9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e" gracePeriod=30 Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.206651 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.302999 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqz8s\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-kube-api-access-tqz8s\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.303325 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.303514 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25f78c50-1bd7-4abd-b231-a932aa15f2af-installation-pull-secrets\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.303551 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-tls\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.303607 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25f78c50-1bd7-4abd-b231-a932aa15f2af-ca-trust-extracted\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.303645 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-trusted-ca\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.303671 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-certificates\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.303705 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-bound-sa-token\") pod \"25f78c50-1bd7-4abd-b231-a932aa15f2af\" (UID: \"25f78c50-1bd7-4abd-b231-a932aa15f2af\") " Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.304806 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.304987 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.310100 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25f78c50-1bd7-4abd-b231-a932aa15f2af-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.310224 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.310888 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.312848 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.313928 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-kube-api-access-tqz8s" (OuterVolumeSpecName: "kube-api-access-tqz8s") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "kube-api-access-tqz8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.324562 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f78c50-1bd7-4abd-b231-a932aa15f2af-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "25f78c50-1bd7-4abd-b231-a932aa15f2af" (UID: "25f78c50-1bd7-4abd-b231-a932aa15f2af"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.404934 4801 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/25f78c50-1bd7-4abd-b231-a932aa15f2af-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.404968 4801 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.404981 4801 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/25f78c50-1bd7-4abd-b231-a932aa15f2af-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.404989 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.404998 4801 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/25f78c50-1bd7-4abd-b231-a932aa15f2af-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.405005 4801 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.405013 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqz8s\" (UniqueName: \"kubernetes.io/projected/25f78c50-1bd7-4abd-b231-a932aa15f2af-kube-api-access-tqz8s\") on node \"crc\" DevicePath \"\"" Jan 22 14:10:39 crc 
kubenswrapper[4801]: I0122 14:10:39.873923 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.873927 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" event={"ID":"25f78c50-1bd7-4abd-b231-a932aa15f2af","Type":"ContainerDied","Data":"9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e"} Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.873991 4801 scope.go:117] "RemoveContainer" containerID="9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.873805 4801 generic.go:334] "Generic (PLEG): container finished" podID="25f78c50-1bd7-4abd-b231-a932aa15f2af" containerID="9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e" exitCode=0 Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.874574 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zckqh" event={"ID":"25f78c50-1bd7-4abd-b231-a932aa15f2af","Type":"ContainerDied","Data":"eb5208e2098a19caa73387149dc2069eafac0ec4e5adc52c1eca25a990af40d1"} Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.892492 4801 scope.go:117] "RemoveContainer" containerID="9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e" Jan 22 14:10:39 crc kubenswrapper[4801]: E0122 14:10:39.893413 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e\": container with ID starting with 9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e not found: ID does not exist" containerID="9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.893441 4801 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e"} err="failed to get container status \"9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e\": rpc error: code = NotFound desc = could not find container \"9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e\": container with ID starting with 9668578cc3fdad5ef4d1b60a259f81a9e6acc62c5f7eaa65519a6d3f8b48b22e not found: ID does not exist" Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.901059 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zckqh"] Jan 22 14:10:39 crc kubenswrapper[4801]: I0122 14:10:39.904818 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zckqh"] Jan 22 14:10:41 crc kubenswrapper[4801]: I0122 14:10:41.577989 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f78c50-1bd7-4abd-b231-a932aa15f2af" path="/var/lib/kubelet/pods/25f78c50-1bd7-4abd-b231-a932aa15f2af/volumes" Jan 22 14:12:34 crc kubenswrapper[4801]: I0122 14:12:34.021378 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:12:34 crc kubenswrapper[4801]: I0122 14:12:34.022198 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:13:04 crc kubenswrapper[4801]: I0122 14:13:04.020908 4801 patch_prober.go:28] interesting 
pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:13:04 crc kubenswrapper[4801]: I0122 14:13:04.021711 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:13:11 crc kubenswrapper[4801]: I0122 14:13:11.767806 4801 scope.go:117] "RemoveContainer" containerID="0aa0d456c3eca4be34a250b5b30a17825ea50f5a2c87a8a3741bcf7c837a5ae4" Jan 22 14:13:11 crc kubenswrapper[4801]: I0122 14:13:11.794491 4801 scope.go:117] "RemoveContainer" containerID="25beb75e155fa4d2ed2de7ce883421880a66b352ae0f7b441cae4f3052ed071b" Jan 22 14:13:11 crc kubenswrapper[4801]: I0122 14:13:11.830105 4801 scope.go:117] "RemoveContainer" containerID="9691f1f23d44d28dcf6f1a49b91033239fe920cb5b4614c27ad9c936772b5e5e" Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.021033 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.022716 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.022860 4801 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.024114 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2947974c20c12cb11a19ecd994e635ad7fbf51a0a5e93f59ad73397d81838cba"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.024397 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://2947974c20c12cb11a19ecd994e635ad7fbf51a0a5e93f59ad73397d81838cba" gracePeriod=600 Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.913458 4801 generic.go:334] "Generic (PLEG): container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="2947974c20c12cb11a19ecd994e635ad7fbf51a0a5e93f59ad73397d81838cba" exitCode=0 Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.913472 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"2947974c20c12cb11a19ecd994e635ad7fbf51a0a5e93f59ad73397d81838cba"} Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.913833 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"60a49b0ab9a653bb3640000258ad0704ac97f382b39d4afd913a38b2e74008da"} Jan 22 14:13:34 crc kubenswrapper[4801]: I0122 14:13:34.913852 4801 scope.go:117] "RemoveContainer" 
containerID="05f9bc84ca95ce886a74f9a3709eccbccd35ca90eb20e5f1f29ed5cdd447357e" Jan 22 14:14:11 crc kubenswrapper[4801]: I0122 14:14:11.865829 4801 scope.go:117] "RemoveContainer" containerID="ba11f4a6ec85ace6916a697aaa6b17f56031308791adc418ae0cd5ccfb1b62db" Jan 22 14:14:11 crc kubenswrapper[4801]: I0122 14:14:11.898660 4801 scope.go:117] "RemoveContainer" containerID="6f264e91265c144390c213c6edfc9fb25ddba679a12c666f728ad8f02f208928" Jan 22 14:14:11 crc kubenswrapper[4801]: I0122 14:14:11.929990 4801 scope.go:117] "RemoveContainer" containerID="fc56c87d6f7ca7e129f80456c71d84b31ae945343cbdb23d79a284b87d31c230" Jan 22 14:14:11 crc kubenswrapper[4801]: I0122 14:14:11.943593 4801 scope.go:117] "RemoveContainer" containerID="7762490a9c1e6a87b3617bd1e3025e5db0da2c8b193431d867128fdb599eebfb" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.170010 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt"] Jan 22 14:15:00 crc kubenswrapper[4801]: E0122 14:15:00.170838 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f78c50-1bd7-4abd-b231-a932aa15f2af" containerName="registry" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.170857 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f78c50-1bd7-4abd-b231-a932aa15f2af" containerName="registry" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.171024 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f78c50-1bd7-4abd-b231-a932aa15f2af" containerName="registry" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.171561 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.173712 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.181015 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt"] Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.181206 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.515282 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb2cecc5-4755-4c56-aa88-ab490092e01a-secret-volume\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.515438 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99scr\" (UniqueName: \"kubernetes.io/projected/bb2cecc5-4755-4c56-aa88-ab490092e01a-kube-api-access-99scr\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.515496 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb2cecc5-4755-4c56-aa88-ab490092e01a-config-volume\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.617651 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99scr\" (UniqueName: \"kubernetes.io/projected/bb2cecc5-4755-4c56-aa88-ab490092e01a-kube-api-access-99scr\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.617756 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb2cecc5-4755-4c56-aa88-ab490092e01a-config-volume\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.617846 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb2cecc5-4755-4c56-aa88-ab490092e01a-secret-volume\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.619195 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb2cecc5-4755-4c56-aa88-ab490092e01a-config-volume\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.626523 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bb2cecc5-4755-4c56-aa88-ab490092e01a-secret-volume\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.637749 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99scr\" (UniqueName: \"kubernetes.io/projected/bb2cecc5-4755-4c56-aa88-ab490092e01a-kube-api-access-99scr\") pod \"collect-profiles-29484855-t9hzt\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.801654 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:00 crc kubenswrapper[4801]: I0122 14:15:00.982928 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt"] Jan 22 14:15:01 crc kubenswrapper[4801]: I0122 14:15:01.449937 4801 generic.go:334] "Generic (PLEG): container finished" podID="bb2cecc5-4755-4c56-aa88-ab490092e01a" containerID="22290f7f210e70d459f4e73f05025d26671ba902ddbe0bdc8a4857a8a7bb15ff" exitCode=0 Jan 22 14:15:01 crc kubenswrapper[4801]: I0122 14:15:01.450078 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" event={"ID":"bb2cecc5-4755-4c56-aa88-ab490092e01a","Type":"ContainerDied","Data":"22290f7f210e70d459f4e73f05025d26671ba902ddbe0bdc8a4857a8a7bb15ff"} Jan 22 14:15:01 crc kubenswrapper[4801]: I0122 14:15:01.450263 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" 
event={"ID":"bb2cecc5-4755-4c56-aa88-ab490092e01a","Type":"ContainerStarted","Data":"2510d8af52fd681aa11f501d7bb6ad43abe335353c40757ad6cfbf2e4cdc7da8"} Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.638648 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.744518 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb2cecc5-4755-4c56-aa88-ab490092e01a-config-volume\") pod \"bb2cecc5-4755-4c56-aa88-ab490092e01a\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.744648 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99scr\" (UniqueName: \"kubernetes.io/projected/bb2cecc5-4755-4c56-aa88-ab490092e01a-kube-api-access-99scr\") pod \"bb2cecc5-4755-4c56-aa88-ab490092e01a\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.744668 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb2cecc5-4755-4c56-aa88-ab490092e01a-secret-volume\") pod \"bb2cecc5-4755-4c56-aa88-ab490092e01a\" (UID: \"bb2cecc5-4755-4c56-aa88-ab490092e01a\") " Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.745413 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb2cecc5-4755-4c56-aa88-ab490092e01a-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb2cecc5-4755-4c56-aa88-ab490092e01a" (UID: "bb2cecc5-4755-4c56-aa88-ab490092e01a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.749674 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2cecc5-4755-4c56-aa88-ab490092e01a-kube-api-access-99scr" (OuterVolumeSpecName: "kube-api-access-99scr") pod "bb2cecc5-4755-4c56-aa88-ab490092e01a" (UID: "bb2cecc5-4755-4c56-aa88-ab490092e01a"). InnerVolumeSpecName "kube-api-access-99scr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.749773 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2cecc5-4755-4c56-aa88-ab490092e01a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bb2cecc5-4755-4c56-aa88-ab490092e01a" (UID: "bb2cecc5-4755-4c56-aa88-ab490092e01a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.846189 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99scr\" (UniqueName: \"kubernetes.io/projected/bb2cecc5-4755-4c56-aa88-ab490092e01a-kube-api-access-99scr\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.846223 4801 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb2cecc5-4755-4c56-aa88-ab490092e01a-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:02 crc kubenswrapper[4801]: I0122 14:15:02.846234 4801 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb2cecc5-4755-4c56-aa88-ab490092e01a-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:03 crc kubenswrapper[4801]: I0122 14:15:03.460861 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" 
event={"ID":"bb2cecc5-4755-4c56-aa88-ab490092e01a","Type":"ContainerDied","Data":"2510d8af52fd681aa11f501d7bb6ad43abe335353c40757ad6cfbf2e4cdc7da8"} Jan 22 14:15:03 crc kubenswrapper[4801]: I0122 14:15:03.460899 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2510d8af52fd681aa11f501d7bb6ad43abe335353c40757ad6cfbf2e4cdc7da8" Jan 22 14:15:03 crc kubenswrapper[4801]: I0122 14:15:03.460962 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484855-t9hzt" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.211200 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nx7sl"] Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.211977 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovn-controller" containerID="cri-o://02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.212106 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-node" containerID="cri-o://049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.212084 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="nbdb" containerID="cri-o://a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.212250 4801 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.212305 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="sbdb" containerID="cri-o://9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.212332 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovn-acl-logging" containerID="cri-o://79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.214468 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="northd" containerID="cri-o://765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.291749 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" containerID="cri-o://95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" gracePeriod=30 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.595093 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/3.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.597424 4801 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovn-acl-logging/0.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.597732 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/2.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.597932 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovn-controller/0.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.598075 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/1.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.598104 4801 generic.go:334] "Generic (PLEG): container finished" podID="82373306-6578-4229-851f-1d80cdabf2d7" containerID="c3af53c510e8852d6f71e64a3ff43342641d6b251fd5f757632ac020558b170b" exitCode=2 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.598801 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerDied","Data":"c3af53c510e8852d6f71e64a3ff43342641d6b251fd5f757632ac020558b170b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.598842 4801 scope.go:117] "RemoveContainer" containerID="bc4f10b04c1b3e20ecab3f8d5d6ea0904e277be4e448152137eae1bab5eae1f4" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.599161 4801 scope.go:117] "RemoveContainer" containerID="c3af53c510e8852d6f71e64a3ff43342641d6b251fd5f757632ac020558b170b" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.599384 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus 
pod=multus-b2p9x_openshift-multus(82373306-6578-4229-851f-1d80cdabf2d7)\"" pod="openshift-multus/multus-b2p9x" podUID="82373306-6578-4229-851f-1d80cdabf2d7" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.599859 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.605282 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovnkube-controller/3.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.607913 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovn-acl-logging/0.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608539 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx7sl_33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/ovn-controller/0.log" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608857 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" exitCode=0 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608882 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" exitCode=0 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608890 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" exitCode=0 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608900 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" 
containerID="765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" exitCode=0 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608908 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" exitCode=0 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608917 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" exitCode=0 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608923 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" exitCode=143 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608893 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608966 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608982 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608995 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" 
event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609008 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609026 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609039 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609052 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609059 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609066 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609072 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609078 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609085 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609091 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609098 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609104 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609113 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609123 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609132 4801 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609139 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609146 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609153 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609160 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609166 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609172 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609178 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609186 4801 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609195 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609206 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609213 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609220 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609226 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609233 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609240 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} Jan 22 
14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609246 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609252 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609258 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609265 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.608930 4801 generic.go:334] "Generic (PLEG): container finished" podID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerID="02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" exitCode=143 Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609292 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" event={"ID":"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b","Type":"ContainerDied","Data":"e45623a049261ffc64fd8636b4c0333cc8ce5a9ec0c994c617626f6b8a550e20"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609303 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609310 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609318 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609325 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609333 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609340 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609347 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609354 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609361 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.609367 4801 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.629489 4801 scope.go:117] "RemoveContainer" containerID="95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.647848 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.684743 4801 scope.go:117] "RemoveContainer" containerID="9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691056 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whjt9"] Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691370 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691390 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691410 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691416 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691424 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovn-acl-logging" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691430 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" 
containerName="ovn-acl-logging" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691439 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="nbdb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691461 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="nbdb" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691470 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="northd" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691476 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="northd" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691482 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-node" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691488 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-node" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691494 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691501 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691509 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kubecfg-setup" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691541 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kubecfg-setup" Jan 22 
14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691551 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovn-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691558 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovn-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691564 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="sbdb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691570 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="sbdb" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691576 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691581 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691590 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691595 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691602 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2cecc5-4755-4c56-aa88-ab490092e01a" containerName="collect-profiles" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691607 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2cecc5-4755-4c56-aa88-ab490092e01a" containerName="collect-profiles" Jan 22 
14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691690 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovn-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691700 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-node" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691706 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691713 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691721 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="northd" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691729 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovn-acl-logging" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691735 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2cecc5-4755-4c56-aa88-ab490092e01a" containerName="collect-profiles" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691742 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691751 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691756 4801 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691764 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="nbdb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691771 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="sbdb" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.691861 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691868 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.691948 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" containerName="ovnkube-controller" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.696754 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.714336 4801 scope.go:117] "RemoveContainer" containerID="a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.728283 4801 scope.go:117] "RemoveContainer" containerID="765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.742655 4801 scope.go:117] "RemoveContainer" containerID="488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.758330 4801 scope.go:117] "RemoveContainer" containerID="049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.770743 4801 scope.go:117] "RemoveContainer" containerID="79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786235 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-systemd\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786278 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-netd\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786302 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-ovn\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: 
\"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786327 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovn-node-metrics-cert\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786350 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-bin\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786368 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-kubelet\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786391 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-openvswitch\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786423 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-ovn-kubernetes\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786427 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786462 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-systemd-units\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786487 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-var-lib-openvswitch\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786500 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786509 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786528 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-netns\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786534 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786545 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786554 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvwm2\" (UniqueName: \"kubernetes.io/projected/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-kube-api-access-cvwm2\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786615 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-etc-openvswitch\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786651 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-script-lib\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786691 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-log-socket\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786727 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-env-overrides\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786752 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-config\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786817 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-node-log\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786853 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-slash\") pod \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\" (UID: \"33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b\") " Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786568 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786591 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786942 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786964 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.786988 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-log-socket" (OuterVolumeSpecName: "log-socket") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787035 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787510 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-slash" (OuterVolumeSpecName: "host-slash") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787533 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787550 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-node-log" (OuterVolumeSpecName: "node-log") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787650 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-kubelet\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787668 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787662 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787703 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-run-netns\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787788 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-systemd-units\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787804 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787884 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-run-ovn-kubernetes\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787938 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovnkube-script-lib\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787966 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-node-log\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.787995 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovn-node-metrics-cert\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788041 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-cni-bin\") pod \"ovnkube-node-whjt9\" (UID: 
\"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788063 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-systemd\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788080 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788115 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovnkube-config\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788156 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788187 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-var-lib-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788223 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x55rk\" (UniqueName: \"kubernetes.io/projected/151869cc-a8d3-4916-9c75-ac9b8c74f942-kube-api-access-x55rk\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788249 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788290 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-env-overrides\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788329 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-etc-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788353 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-ovn\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788382 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-slash\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788411 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-log-socket\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788432 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-cni-netd\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788557 4801 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-node-log\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788574 4801 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-slash\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 
14:15:27.788587 4801 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788601 4801 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788594 4801 scope.go:117] "RemoveContainer" containerID="02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788654 4801 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788669 4801 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788683 4801 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788694 4801 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788708 4801 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-systemd-units\") on node \"crc\" DevicePath 
\"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788753 4801 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788791 4801 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788810 4801 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788821 4801 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788832 4801 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788843 4801 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-log-socket\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.788854 4801 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 
14:15:27.788864 4801 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.792756 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.793009 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-kube-api-access-cvwm2" (OuterVolumeSpecName: "kube-api-access-cvwm2") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "kube-api-access-cvwm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.803382 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" (UID: "33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.805567 4801 scope.go:117] "RemoveContainer" containerID="4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.819075 4801 scope.go:117] "RemoveContainer" containerID="95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.819404 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": container with ID starting with 95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a not found: ID does not exist" containerID="95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.819435 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} err="failed to get container status \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": rpc error: code = NotFound desc = could not find container \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": container with ID starting with 95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.819493 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.819849 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": container with ID starting with 
ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980 not found: ID does not exist" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.819870 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} err="failed to get container status \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": rpc error: code = NotFound desc = could not find container \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": container with ID starting with ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.819884 4801 scope.go:117] "RemoveContainer" containerID="9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.820076 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": container with ID starting with 9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b not found: ID does not exist" containerID="9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820095 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} err="failed to get container status \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": rpc error: code = NotFound desc = could not find container \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": container with ID starting with 9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b not found: ID does not 
exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820106 4801 scope.go:117] "RemoveContainer" containerID="a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.820330 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": container with ID starting with a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7 not found: ID does not exist" containerID="a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820348 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} err="failed to get container status \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": rpc error: code = NotFound desc = could not find container \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": container with ID starting with a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820360 4801 scope.go:117] "RemoveContainer" containerID="765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.820546 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": container with ID starting with 765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f not found: ID does not exist" containerID="765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820563 4801 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} err="failed to get container status \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": rpc error: code = NotFound desc = could not find container \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": container with ID starting with 765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820575 4801 scope.go:117] "RemoveContainer" containerID="488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.820810 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": container with ID starting with 488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12 not found: ID does not exist" containerID="488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820834 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} err="failed to get container status \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": rpc error: code = NotFound desc = could not find container \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": container with ID starting with 488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.820846 4801 scope.go:117] "RemoveContainer" containerID="049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.821048 4801 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": container with ID starting with 049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847 not found: ID does not exist" containerID="049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821068 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} err="failed to get container status \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": rpc error: code = NotFound desc = could not find container \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": container with ID starting with 049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821079 4801 scope.go:117] "RemoveContainer" containerID="79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.821255 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": container with ID starting with 79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb not found: ID does not exist" containerID="79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821276 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} err="failed to get container status \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": rpc error: code = NotFound desc = could 
not find container \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": container with ID starting with 79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821288 4801 scope.go:117] "RemoveContainer" containerID="02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.821515 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": container with ID starting with 02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08 not found: ID does not exist" containerID="02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821540 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} err="failed to get container status \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": rpc error: code = NotFound desc = could not find container \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": container with ID starting with 02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821554 4801 scope.go:117] "RemoveContainer" containerID="4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b" Jan 22 14:15:27 crc kubenswrapper[4801]: E0122 14:15:27.821800 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": container with ID starting with 4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b not found: 
ID does not exist" containerID="4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821823 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} err="failed to get container status \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": rpc error: code = NotFound desc = could not find container \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": container with ID starting with 4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.821836 4801 scope.go:117] "RemoveContainer" containerID="95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822044 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} err="failed to get container status \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": rpc error: code = NotFound desc = could not find container \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": container with ID starting with 95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822065 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822256 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} err="failed to get container status \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": rpc error: code = 
NotFound desc = could not find container \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": container with ID starting with ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822277 4801 scope.go:117] "RemoveContainer" containerID="9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822508 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} err="failed to get container status \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": rpc error: code = NotFound desc = could not find container \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": container with ID starting with 9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822530 4801 scope.go:117] "RemoveContainer" containerID="a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822702 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} err="failed to get container status \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": rpc error: code = NotFound desc = could not find container \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": container with ID starting with a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822722 4801 scope.go:117] "RemoveContainer" containerID="765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" Jan 22 14:15:27 crc 
kubenswrapper[4801]: I0122 14:15:27.822908 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} err="failed to get container status \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": rpc error: code = NotFound desc = could not find container \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": container with ID starting with 765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.822930 4801 scope.go:117] "RemoveContainer" containerID="488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823095 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} err="failed to get container status \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": rpc error: code = NotFound desc = could not find container \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": container with ID starting with 488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823115 4801 scope.go:117] "RemoveContainer" containerID="049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823311 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} err="failed to get container status \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": rpc error: code = NotFound desc = could not find container \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": container 
with ID starting with 049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823332 4801 scope.go:117] "RemoveContainer" containerID="79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823622 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} err="failed to get container status \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": rpc error: code = NotFound desc = could not find container \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": container with ID starting with 79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823645 4801 scope.go:117] "RemoveContainer" containerID="02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823911 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} err="failed to get container status \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": rpc error: code = NotFound desc = could not find container \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": container with ID starting with 02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.823933 4801 scope.go:117] "RemoveContainer" containerID="4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824143 4801 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} err="failed to get container status \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": rpc error: code = NotFound desc = could not find container \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": container with ID starting with 4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824165 4801 scope.go:117] "RemoveContainer" containerID="95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824370 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} err="failed to get container status \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": rpc error: code = NotFound desc = could not find container \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": container with ID starting with 95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824392 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824598 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} err="failed to get container status \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": rpc error: code = NotFound desc = could not find container \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": container with ID starting with ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980 not found: ID does not 
exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824621 4801 scope.go:117] "RemoveContainer" containerID="9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824841 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} err="failed to get container status \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": rpc error: code = NotFound desc = could not find container \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": container with ID starting with 9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.824864 4801 scope.go:117] "RemoveContainer" containerID="a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825021 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} err="failed to get container status \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": rpc error: code = NotFound desc = could not find container \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": container with ID starting with a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825042 4801 scope.go:117] "RemoveContainer" containerID="765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825224 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} err="failed to get container status 
\"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": rpc error: code = NotFound desc = could not find container \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": container with ID starting with 765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825299 4801 scope.go:117] "RemoveContainer" containerID="488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825590 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} err="failed to get container status \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": rpc error: code = NotFound desc = could not find container \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": container with ID starting with 488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825609 4801 scope.go:117] "RemoveContainer" containerID="049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825903 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} err="failed to get container status \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": rpc error: code = NotFound desc = could not find container \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": container with ID starting with 049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.825957 4801 scope.go:117] "RemoveContainer" 
containerID="79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.826270 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} err="failed to get container status \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": rpc error: code = NotFound desc = could not find container \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": container with ID starting with 79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.826289 4801 scope.go:117] "RemoveContainer" containerID="02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.826630 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} err="failed to get container status \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": rpc error: code = NotFound desc = could not find container \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": container with ID starting with 02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.826650 4801 scope.go:117] "RemoveContainer" containerID="4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.826887 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} err="failed to get container status \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": rpc error: code = NotFound desc = could 
not find container \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": container with ID starting with 4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.826906 4801 scope.go:117] "RemoveContainer" containerID="95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.827121 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a"} err="failed to get container status \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": rpc error: code = NotFound desc = could not find container \"95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a\": container with ID starting with 95c8f7ee455c5cfb6c9368786b8f574c8f7c072d8788a456a1706f97af32190a not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.827143 4801 scope.go:117] "RemoveContainer" containerID="ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.827375 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980"} err="failed to get container status \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": rpc error: code = NotFound desc = could not find container \"ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980\": container with ID starting with ba8f6dac4c2f9b92b4aabe24aa214fc9e702febc25fd76b036337e24d73b9980 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.827402 4801 scope.go:117] "RemoveContainer" containerID="9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 
14:15:27.827672 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b"} err="failed to get container status \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": rpc error: code = NotFound desc = could not find container \"9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b\": container with ID starting with 9f5896c51beb589df5ded0292e44aba8a4f647838e5d717931d90da2e2a8f52b not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.827694 4801 scope.go:117] "RemoveContainer" containerID="a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.827896 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7"} err="failed to get container status \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": rpc error: code = NotFound desc = could not find container \"a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7\": container with ID starting with a2b8347850f0df8cb185fc0b797255cf630a79871c82a2f1f604ff37112b5fc7 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.827911 4801 scope.go:117] "RemoveContainer" containerID="765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828087 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f"} err="failed to get container status \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": rpc error: code = NotFound desc = could not find container \"765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f\": container with ID starting with 
765746db12235a43e1dcacd0fcdec94d6c0f9c4f9d6f3c61d18639065aa97f9f not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828103 4801 scope.go:117] "RemoveContainer" containerID="488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828302 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12"} err="failed to get container status \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": rpc error: code = NotFound desc = could not find container \"488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12\": container with ID starting with 488fd6fd8da72a4b407d4a52734dfac8b8f4a1da7fd00c656fb1ad105a8fda12 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828319 4801 scope.go:117] "RemoveContainer" containerID="049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828566 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847"} err="failed to get container status \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": rpc error: code = NotFound desc = could not find container \"049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847\": container with ID starting with 049612b390b7c631ebf028d8a45b59f48f00b7ce50723591b56811d469a17847 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828589 4801 scope.go:117] "RemoveContainer" containerID="79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828780 4801 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb"} err="failed to get container status \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": rpc error: code = NotFound desc = could not find container \"79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb\": container with ID starting with 79170b6d6044fd4be1837f230d3d2816f211e51c795ff4f53bcd0cdf2fd765eb not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828802 4801 scope.go:117] "RemoveContainer" containerID="02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.828984 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08"} err="failed to get container status \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": rpc error: code = NotFound desc = could not find container \"02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08\": container with ID starting with 02d76963165b9f842f5b7de882971a642f0d8774eb90548763edfa5f236b2a08 not found: ID does not exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.829008 4801 scope.go:117] "RemoveContainer" containerID="4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.829204 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b"} err="failed to get container status \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": rpc error: code = NotFound desc = could not find container \"4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b\": container with ID starting with 4ac6007e207e28fca7ce2a7aaba88e84f5d1448e2c77e48a3f17dcf10819c29b not found: ID does not 
exist" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890345 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-cni-bin\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890395 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-systemd\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890425 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovnkube-config\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890462 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-var-lib-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890486 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc 
kubenswrapper[4801]: I0122 14:15:27.890515 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x55rk\" (UniqueName: \"kubernetes.io/projected/151869cc-a8d3-4916-9c75-ac9b8c74f942-kube-api-access-x55rk\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890533 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890559 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-env-overrides\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890585 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-etc-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890606 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-ovn\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890628 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-slash\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890651 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-log-socket\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890670 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-cni-netd\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890694 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-kubelet\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890716 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-run-netns\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890755 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-systemd-units\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890789 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-run-ovn-kubernetes\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890810 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovnkube-script-lib\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890807 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890863 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-node-log\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890832 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-node-log\") pod 
\"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890872 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-slash\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890911 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovn-node-metrics-cert\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890872 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-run-netns\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890960 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-systemd\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890961 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-systemd-units\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890977 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-cni-netd\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890996 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-log-socket\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.890999 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-run-ovn-kubernetes\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891025 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-cni-bin\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891358 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-var-lib-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 
14:15:27.891393 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-etc-openvswitch\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891401 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-kubelet\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891392 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891512 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/151869cc-a8d3-4916-9c75-ac9b8c74f942-run-ovn\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891599 4801 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891613 4801 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.891630 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvwm2\" (UniqueName: \"kubernetes.io/projected/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b-kube-api-access-cvwm2\") on node \"crc\" DevicePath \"\"" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.892017 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-env-overrides\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.892165 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovnkube-script-lib\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.894677 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovnkube-config\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.895028 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/151869cc-a8d3-4916-9c75-ac9b8c74f942-ovn-node-metrics-cert\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:27 crc kubenswrapper[4801]: I0122 14:15:27.908719 4801 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x55rk\" (UniqueName: \"kubernetes.io/projected/151869cc-a8d3-4916-9c75-ac9b8c74f942-kube-api-access-x55rk\") pod \"ovnkube-node-whjt9\" (UID: \"151869cc-a8d3-4916-9c75-ac9b8c74f942\") " pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.021518 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.615373 4801 generic.go:334] "Generic (PLEG): container finished" podID="151869cc-a8d3-4916-9c75-ac9b8c74f942" containerID="2d63820d706f8fe0c5f81133874e5684f7f89fb93acfeef09ed9b84764b30237" exitCode=0 Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.615473 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerDied","Data":"2d63820d706f8fe0c5f81133874e5684f7f89fb93acfeef09ed9b84764b30237"} Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.615530 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"380cc3200c0ef10c5ef3d28e26b00f134088921fbff8c24edd964c19a1b31350"} Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.618101 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/2.log" Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.621617 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx7sl" Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.660521 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nx7sl"] Jan 22 14:15:28 crc kubenswrapper[4801]: I0122 14:15:28.674788 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nx7sl"] Jan 22 14:15:29 crc kubenswrapper[4801]: I0122 14:15:29.580027 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b" path="/var/lib/kubelet/pods/33ce5ee6-87c5-4c2a-8ba1-80e12f05a42b/volumes" Jan 22 14:15:29 crc kubenswrapper[4801]: I0122 14:15:29.629080 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"06dd5b8f269f8652286e195636bf25ef3a6dc6911de8718d80a33f9c813ec5bf"} Jan 22 14:15:29 crc kubenswrapper[4801]: I0122 14:15:29.629130 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"5b282eb66f244fb18ad3f99c2c3b5eecdcb1f9be802834faa9ea93def625e3f8"} Jan 22 14:15:30 crc kubenswrapper[4801]: I0122 14:15:30.643965 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"2dedbe3f0a5e160d723f7df9c01f08f6e60eb0bca73791d6f969a8ba091c0795"} Jan 22 14:15:30 crc kubenswrapper[4801]: I0122 14:15:30.644440 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"ea8e47e2b992d49cc5f94372f633d8150a6b998c944dd76283e56f0539f1c897"} Jan 22 14:15:30 crc kubenswrapper[4801]: I0122 
14:15:30.644515 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"21ea027105fe1ad0ecc41bfa0f733d3a78981ce41c77d2aa934f569f18765864"} Jan 22 14:15:30 crc kubenswrapper[4801]: I0122 14:15:30.644539 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"5ff78467dc139b4fede916df1510a718d08a4310e4afaddad6bd7a21e80db716"} Jan 22 14:15:32 crc kubenswrapper[4801]: I0122 14:15:32.657392 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"c35aa7bcb094e9da6ca61161cf9384d81bb64ade16bf5de34ca06168584fb6c7"} Jan 22 14:15:33 crc kubenswrapper[4801]: I0122 14:15:33.918835 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-dpckc"] Jan 22 14:15:33 crc kubenswrapper[4801]: I0122 14:15:33.919459 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:33 crc kubenswrapper[4801]: I0122 14:15:33.921488 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 22 14:15:33 crc kubenswrapper[4801]: I0122 14:15:33.921297 4801 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-54r7c" Jan 22 14:15:33 crc kubenswrapper[4801]: I0122 14:15:33.921884 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 22 14:15:33 crc kubenswrapper[4801]: I0122 14:15:33.921894 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.021613 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.021680 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.067668 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dabf097f-de22-491e-a357-a2c703d51084-crc-storage\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.067749 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dabf097f-de22-491e-a357-a2c703d51084-node-mnt\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.067774 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl5b\" (UniqueName: \"kubernetes.io/projected/dabf097f-de22-491e-a357-a2c703d51084-kube-api-access-wbl5b\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.169213 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dabf097f-de22-491e-a357-a2c703d51084-node-mnt\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.169293 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl5b\" (UniqueName: \"kubernetes.io/projected/dabf097f-de22-491e-a357-a2c703d51084-kube-api-access-wbl5b\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.169430 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dabf097f-de22-491e-a357-a2c703d51084-crc-storage\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.169763 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/dabf097f-de22-491e-a357-a2c703d51084-node-mnt\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.170753 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dabf097f-de22-491e-a357-a2c703d51084-crc-storage\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.194623 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl5b\" (UniqueName: \"kubernetes.io/projected/dabf097f-de22-491e-a357-a2c703d51084-kube-api-access-wbl5b\") pod \"crc-storage-crc-dpckc\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.236062 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: E0122 14:15:34.280758 4801 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(637b2f744ae7bdfd318395419714b21eea9c62eec9a772b04813a7c4f6b2f784): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 22 14:15:34 crc kubenswrapper[4801]: E0122 14:15:34.280911 4801 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(637b2f744ae7bdfd318395419714b21eea9c62eec9a772b04813a7c4f6b2f784): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: E0122 14:15:34.280951 4801 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(637b2f744ae7bdfd318395419714b21eea9c62eec9a772b04813a7c4f6b2f784): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:34 crc kubenswrapper[4801]: E0122 14:15:34.281031 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-dpckc_crc-storage(dabf097f-de22-491e-a357-a2c703d51084)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-dpckc_crc-storage(dabf097f-de22-491e-a357-a2c703d51084)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(637b2f744ae7bdfd318395419714b21eea9c62eec9a772b04813a7c4f6b2f784): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-dpckc" podUID="dabf097f-de22-491e-a357-a2c703d51084" Jan 22 14:15:34 crc kubenswrapper[4801]: I0122 14:15:34.672788 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" event={"ID":"151869cc-a8d3-4916-9c75-ac9b8c74f942","Type":"ContainerStarted","Data":"b240d19c1ce556ac493437a5f0ccab386dad0912d2f7c97a68e08f3f2bc156ef"} Jan 22 14:15:35 crc kubenswrapper[4801]: I0122 14:15:35.265672 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dpckc"] Jan 22 14:15:35 crc kubenswrapper[4801]: I0122 14:15:35.265771 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:35 crc kubenswrapper[4801]: I0122 14:15:35.266131 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:35 crc kubenswrapper[4801]: E0122 14:15:35.288216 4801 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(09a4d19a07e2b5e2d7eeae89330dde0189d48fe4acfd950614445daa655ed2b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 22 14:15:35 crc kubenswrapper[4801]: E0122 14:15:35.288287 4801 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(09a4d19a07e2b5e2d7eeae89330dde0189d48fe4acfd950614445daa655ed2b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:35 crc kubenswrapper[4801]: E0122 14:15:35.288313 4801 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(09a4d19a07e2b5e2d7eeae89330dde0189d48fe4acfd950614445daa655ed2b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:35 crc kubenswrapper[4801]: E0122 14:15:35.288407 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-dpckc_crc-storage(dabf097f-de22-491e-a357-a2c703d51084)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-dpckc_crc-storage(dabf097f-de22-491e-a357-a2c703d51084)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(09a4d19a07e2b5e2d7eeae89330dde0189d48fe4acfd950614445daa655ed2b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-dpckc" podUID="dabf097f-de22-491e-a357-a2c703d51084" Jan 22 14:15:35 crc kubenswrapper[4801]: I0122 14:15:35.677524 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:35 crc kubenswrapper[4801]: I0122 14:15:35.677801 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:35 crc kubenswrapper[4801]: I0122 14:15:35.710413 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:35 crc kubenswrapper[4801]: I0122 14:15:35.712396 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" podStartSLOduration=8.712374006 podStartE2EDuration="8.712374006s" podCreationTimestamp="2026-01-22 14:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:15:35.706711352 +0000 UTC m=+684.408611555" watchObservedRunningTime="2026-01-22 14:15:35.712374006 +0000 UTC m=+684.414274209" Jan 22 14:15:36 crc kubenswrapper[4801]: I0122 
14:15:36.684324 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:36 crc kubenswrapper[4801]: I0122 14:15:36.723356 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:15:39 crc kubenswrapper[4801]: I0122 14:15:39.572480 4801 scope.go:117] "RemoveContainer" containerID="c3af53c510e8852d6f71e64a3ff43342641d6b251fd5f757632ac020558b170b" Jan 22 14:15:39 crc kubenswrapper[4801]: E0122 14:15:39.573090 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-b2p9x_openshift-multus(82373306-6578-4229-851f-1d80cdabf2d7)\"" pod="openshift-multus/multus-b2p9x" podUID="82373306-6578-4229-851f-1d80cdabf2d7" Jan 22 14:15:48 crc kubenswrapper[4801]: I0122 14:15:48.570606 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:48 crc kubenswrapper[4801]: I0122 14:15:48.571823 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:48 crc kubenswrapper[4801]: E0122 14:15:48.597847 4801 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(468a1d0662fb7cdebab57104b251e663d2f863759fe5e327ddb22fb8e4fc0c49): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 22 14:15:48 crc kubenswrapper[4801]: E0122 14:15:48.597923 4801 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(468a1d0662fb7cdebab57104b251e663d2f863759fe5e327ddb22fb8e4fc0c49): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:48 crc kubenswrapper[4801]: E0122 14:15:48.597952 4801 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(468a1d0662fb7cdebab57104b251e663d2f863759fe5e327ddb22fb8e4fc0c49): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:15:48 crc kubenswrapper[4801]: E0122 14:15:48.598008 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-dpckc_crc-storage(dabf097f-de22-491e-a357-a2c703d51084)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-dpckc_crc-storage(dabf097f-de22-491e-a357-a2c703d51084)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-dpckc_crc-storage_dabf097f-de22-491e-a357-a2c703d51084_0(468a1d0662fb7cdebab57104b251e663d2f863759fe5e327ddb22fb8e4fc0c49): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-dpckc" podUID="dabf097f-de22-491e-a357-a2c703d51084" Jan 22 14:15:51 crc kubenswrapper[4801]: I0122 14:15:51.574948 4801 scope.go:117] "RemoveContainer" containerID="c3af53c510e8852d6f71e64a3ff43342641d6b251fd5f757632ac020558b170b" Jan 22 14:15:51 crc kubenswrapper[4801]: I0122 14:15:51.766439 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b2p9x_82373306-6578-4229-851f-1d80cdabf2d7/kube-multus/2.log" Jan 22 14:15:51 crc kubenswrapper[4801]: I0122 14:15:51.766515 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b2p9x" event={"ID":"82373306-6578-4229-851f-1d80cdabf2d7","Type":"ContainerStarted","Data":"0b36062f3e4db360222b6af9d53de2efb6ae7957c7876b024e81d63a5a71e775"} Jan 22 14:15:58 crc kubenswrapper[4801]: I0122 14:15:58.054782 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whjt9" Jan 22 14:16:02 crc kubenswrapper[4801]: I0122 14:16:02.570469 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:16:02 crc kubenswrapper[4801]: I0122 14:16:02.571299 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:16:02 crc kubenswrapper[4801]: I0122 14:16:02.769944 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dpckc"] Jan 22 14:16:02 crc kubenswrapper[4801]: I0122 14:16:02.780315 4801 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 14:16:02 crc kubenswrapper[4801]: I0122 14:16:02.830105 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpckc" event={"ID":"dabf097f-de22-491e-a357-a2c703d51084","Type":"ContainerStarted","Data":"f4b389125fc1b90e1443e5681043cadca99594603e1428bf649f4f4b782e8d39"} Jan 22 14:16:04 crc kubenswrapper[4801]: I0122 14:16:04.021006 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:16:04 crc kubenswrapper[4801]: I0122 14:16:04.021088 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:16:07 crc kubenswrapper[4801]: I0122 14:16:07.857795 4801 generic.go:334] "Generic (PLEG): container finished" podID="dabf097f-de22-491e-a357-a2c703d51084" containerID="83b9f8948f2aef6dbf8cdd19cb46f94ef9a9b67f3b031945d8ef2f4861c06f00" exitCode=0 Jan 22 14:16:07 crc kubenswrapper[4801]: I0122 14:16:07.857917 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpckc" 
event={"ID":"dabf097f-de22-491e-a357-a2c703d51084","Type":"ContainerDied","Data":"83b9f8948f2aef6dbf8cdd19cb46f94ef9a9b67f3b031945d8ef2f4861c06f00"} Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.169253 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.284988 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dabf097f-de22-491e-a357-a2c703d51084-node-mnt\") pod \"dabf097f-de22-491e-a357-a2c703d51084\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.285086 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dabf097f-de22-491e-a357-a2c703d51084-crc-storage\") pod \"dabf097f-de22-491e-a357-a2c703d51084\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.285132 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dabf097f-de22-491e-a357-a2c703d51084-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "dabf097f-de22-491e-a357-a2c703d51084" (UID: "dabf097f-de22-491e-a357-a2c703d51084"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.285166 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbl5b\" (UniqueName: \"kubernetes.io/projected/dabf097f-de22-491e-a357-a2c703d51084-kube-api-access-wbl5b\") pod \"dabf097f-de22-491e-a357-a2c703d51084\" (UID: \"dabf097f-de22-491e-a357-a2c703d51084\") " Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.285393 4801 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dabf097f-de22-491e-a357-a2c703d51084-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.290050 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabf097f-de22-491e-a357-a2c703d51084-kube-api-access-wbl5b" (OuterVolumeSpecName: "kube-api-access-wbl5b") pod "dabf097f-de22-491e-a357-a2c703d51084" (UID: "dabf097f-de22-491e-a357-a2c703d51084"). InnerVolumeSpecName "kube-api-access-wbl5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.301327 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabf097f-de22-491e-a357-a2c703d51084-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "dabf097f-de22-491e-a357-a2c703d51084" (UID: "dabf097f-de22-491e-a357-a2c703d51084"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.386211 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbl5b\" (UniqueName: \"kubernetes.io/projected/dabf097f-de22-491e-a357-a2c703d51084-kube-api-access-wbl5b\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.386254 4801 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dabf097f-de22-491e-a357-a2c703d51084-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:09 crc kubenswrapper[4801]: E0122 14:16:09.672621 4801 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabf097f_de22_491e_a357_a2c703d51084.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabf097f_de22_491e_a357_a2c703d51084.slice/crio-f4b389125fc1b90e1443e5681043cadca99594603e1428bf649f4f4b782e8d39\": RecentStats: unable to find data in memory cache]" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.880026 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpckc" event={"ID":"dabf097f-de22-491e-a357-a2c703d51084","Type":"ContainerDied","Data":"f4b389125fc1b90e1443e5681043cadca99594603e1428bf649f4f4b782e8d39"} Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.880088 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b389125fc1b90e1443e5681043cadca99594603e1428bf649f4f4b782e8d39" Jan 22 14:16:09 crc kubenswrapper[4801]: I0122 14:16:09.880173 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpckc" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.418081 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw"] Jan 22 14:16:16 crc kubenswrapper[4801]: E0122 14:16:16.418889 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabf097f-de22-491e-a357-a2c703d51084" containerName="storage" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.418904 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf097f-de22-491e-a357-a2c703d51084" containerName="storage" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.419029 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabf097f-de22-491e-a357-a2c703d51084" containerName="storage" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.419941 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.429041 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw"] Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.429114 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.488155 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5rpw\" (UniqueName: \"kubernetes.io/projected/6c1ed5aa-48cb-4d1f-8691-6edc756db955-kube-api-access-c5rpw\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc 
kubenswrapper[4801]: I0122 14:16:16.488217 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.488284 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.589607 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.589711 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.589774 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5rpw\" 
(UniqueName: \"kubernetes.io/projected/6c1ed5aa-48cb-4d1f-8691-6edc756db955-kube-api-access-c5rpw\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.590165 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.590536 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.617024 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5rpw\" (UniqueName: \"kubernetes.io/projected/6c1ed5aa-48cb-4d1f-8691-6edc756db955-kube-api-access-c5rpw\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:16 crc kubenswrapper[4801]: I0122 14:16:16.753963 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:17 crc kubenswrapper[4801]: I0122 14:16:17.145759 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw"] Jan 22 14:16:17 crc kubenswrapper[4801]: I0122 14:16:17.936820 4801 generic.go:334] "Generic (PLEG): container finished" podID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerID="1078a589e9e72875c722f22b322685094071c0ed1a34d3873444a2b7e272a821" exitCode=0 Jan 22 14:16:17 crc kubenswrapper[4801]: I0122 14:16:17.936887 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" event={"ID":"6c1ed5aa-48cb-4d1f-8691-6edc756db955","Type":"ContainerDied","Data":"1078a589e9e72875c722f22b322685094071c0ed1a34d3873444a2b7e272a821"} Jan 22 14:16:17 crc kubenswrapper[4801]: I0122 14:16:17.937193 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" event={"ID":"6c1ed5aa-48cb-4d1f-8691-6edc756db955","Type":"ContainerStarted","Data":"48e7cffde50f2804fdbe457551b2b60a7f085b13f812d3761d670ed32d990e13"} Jan 22 14:16:19 crc kubenswrapper[4801]: I0122 14:16:19.951550 4801 generic.go:334] "Generic (PLEG): container finished" podID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerID="60fd68d36c23ae5c65735d9a442745354bf5686a3dee993e6e10bf824d52febb" exitCode=0 Jan 22 14:16:19 crc kubenswrapper[4801]: I0122 14:16:19.951671 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" event={"ID":"6c1ed5aa-48cb-4d1f-8691-6edc756db955","Type":"ContainerDied","Data":"60fd68d36c23ae5c65735d9a442745354bf5686a3dee993e6e10bf824d52febb"} Jan 22 14:16:20 crc kubenswrapper[4801]: I0122 14:16:20.963388 4801 
generic.go:334] "Generic (PLEG): container finished" podID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerID="7ea107b18b9bc5743c4160d61ad88ba8ff5397c2030167177862c63e53e4ea0f" exitCode=0 Jan 22 14:16:20 crc kubenswrapper[4801]: I0122 14:16:20.963479 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" event={"ID":"6c1ed5aa-48cb-4d1f-8691-6edc756db955","Type":"ContainerDied","Data":"7ea107b18b9bc5743c4160d61ad88ba8ff5397c2030167177862c63e53e4ea0f"} Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.250159 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.402684 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-util\") pod \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.402772 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-bundle\") pod \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.402817 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5rpw\" (UniqueName: \"kubernetes.io/projected/6c1ed5aa-48cb-4d1f-8691-6edc756db955-kube-api-access-c5rpw\") pod \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\" (UID: \"6c1ed5aa-48cb-4d1f-8691-6edc756db955\") " Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.403416 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-bundle" (OuterVolumeSpecName: "bundle") pod "6c1ed5aa-48cb-4d1f-8691-6edc756db955" (UID: "6c1ed5aa-48cb-4d1f-8691-6edc756db955"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.403913 4801 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.408148 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1ed5aa-48cb-4d1f-8691-6edc756db955-kube-api-access-c5rpw" (OuterVolumeSpecName: "kube-api-access-c5rpw") pod "6c1ed5aa-48cb-4d1f-8691-6edc756db955" (UID: "6c1ed5aa-48cb-4d1f-8691-6edc756db955"). InnerVolumeSpecName "kube-api-access-c5rpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.423309 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-util" (OuterVolumeSpecName: "util") pod "6c1ed5aa-48cb-4d1f-8691-6edc756db955" (UID: "6c1ed5aa-48cb-4d1f-8691-6edc756db955"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.504655 4801 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c1ed5aa-48cb-4d1f-8691-6edc756db955-util\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.504697 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5rpw\" (UniqueName: \"kubernetes.io/projected/6c1ed5aa-48cb-4d1f-8691-6edc756db955-kube-api-access-c5rpw\") on node \"crc\" DevicePath \"\"" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.978118 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" event={"ID":"6c1ed5aa-48cb-4d1f-8691-6edc756db955","Type":"ContainerDied","Data":"48e7cffde50f2804fdbe457551b2b60a7f085b13f812d3761d670ed32d990e13"} Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.978182 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48e7cffde50f2804fdbe457551b2b60a7f085b13f812d3761d670ed32d990e13" Jan 22 14:16:22 crc kubenswrapper[4801]: I0122 14:16:22.978215 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.227638 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-pg4px"] Jan 22 14:16:25 crc kubenswrapper[4801]: E0122 14:16:25.227876 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerName="pull" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.227890 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerName="pull" Jan 22 14:16:25 crc kubenswrapper[4801]: E0122 14:16:25.227900 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerName="extract" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.227908 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerName="extract" Jan 22 14:16:25 crc kubenswrapper[4801]: E0122 14:16:25.227925 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerName="util" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.227933 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerName="util" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.228074 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1ed5aa-48cb-4d1f-8691-6edc756db955" containerName="extract" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.228501 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.230579 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.230605 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gvbqg" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.230865 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.250919 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-pg4px"] Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.340612 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plttv\" (UniqueName: \"kubernetes.io/projected/c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630-kube-api-access-plttv\") pod \"nmstate-operator-646758c888-pg4px\" (UID: \"c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630\") " pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.441752 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plttv\" (UniqueName: \"kubernetes.io/projected/c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630-kube-api-access-plttv\") pod \"nmstate-operator-646758c888-pg4px\" (UID: \"c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630\") " pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.460870 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plttv\" (UniqueName: \"kubernetes.io/projected/c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630-kube-api-access-plttv\") pod \"nmstate-operator-646758c888-pg4px\" (UID: 
\"c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630\") " pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.544661 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" Jan 22 14:16:25 crc kubenswrapper[4801]: I0122 14:16:25.991729 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-pg4px"] Jan 22 14:16:27 crc kubenswrapper[4801]: I0122 14:16:27.002442 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" event={"ID":"c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630","Type":"ContainerStarted","Data":"f10215027c67bb7eea07617784b3b3bd961852112b24f90d11ff38104c062a06"} Jan 22 14:16:34 crc kubenswrapper[4801]: I0122 14:16:34.021253 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:16:34 crc kubenswrapper[4801]: I0122 14:16:34.022671 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:16:34 crc kubenswrapper[4801]: I0122 14:16:34.022737 4801 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:16:34 crc kubenswrapper[4801]: I0122 14:16:34.023367 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"60a49b0ab9a653bb3640000258ad0704ac97f382b39d4afd913a38b2e74008da"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:16:34 crc kubenswrapper[4801]: I0122 14:16:34.023433 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://60a49b0ab9a653bb3640000258ad0704ac97f382b39d4afd913a38b2e74008da" gracePeriod=600 Jan 22 14:16:35 crc kubenswrapper[4801]: I0122 14:16:35.049137 4801 generic.go:334] "Generic (PLEG): container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="60a49b0ab9a653bb3640000258ad0704ac97f382b39d4afd913a38b2e74008da" exitCode=0 Jan 22 14:16:35 crc kubenswrapper[4801]: I0122 14:16:35.049171 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"60a49b0ab9a653bb3640000258ad0704ac97f382b39d4afd913a38b2e74008da"} Jan 22 14:16:35 crc kubenswrapper[4801]: I0122 14:16:35.049518 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"61d4928d4fe513fa3ba1966842778daa9adc983474924e63b516aff418f3aa6f"} Jan 22 14:16:35 crc kubenswrapper[4801]: I0122 14:16:35.049536 4801 scope.go:117] "RemoveContainer" containerID="2947974c20c12cb11a19ecd994e635ad7fbf51a0a5e93f59ad73397d81838cba" Jan 22 14:16:36 crc kubenswrapper[4801]: I0122 14:16:36.056913 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" 
event={"ID":"c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630","Type":"ContainerStarted","Data":"9705f25e4daa82fa00c7da6296411fcac4cb3d6d24f6b9e54b21e5657a380616"} Jan 22 14:16:36 crc kubenswrapper[4801]: I0122 14:16:36.079409 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-pg4px" podStartSLOduration=1.271385782 podStartE2EDuration="11.079385457s" podCreationTimestamp="2026-01-22 14:16:25 +0000 UTC" firstStartedPulling="2026-01-22 14:16:26.001684681 +0000 UTC m=+734.703584864" lastFinishedPulling="2026-01-22 14:16:35.809684356 +0000 UTC m=+744.511584539" observedRunningTime="2026-01-22 14:16:36.07396838 +0000 UTC m=+744.775868603" watchObservedRunningTime="2026-01-22 14:16:36.079385457 +0000 UTC m=+744.781285650" Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.062246 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-qchm6"] Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.063524 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6" Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.065122 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-x6vfj" Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.075958 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-qchm6"] Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.090912 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"] Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.092103 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4" Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.095548 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.121193 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-pvj5p"] Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.130063 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pvj5p" Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.138475 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"] Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.226968 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"] Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.227623 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.229559 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6v49p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.229697 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.235687 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.238053 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"]
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.263801 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0697bb7e-f3c4-4945-ab46-21e0ae796a8d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-rwdj4\" (UID: \"0697bb7e-f3c4-4945-ab46-21e0ae796a8d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.263907 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-ovs-socket\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.263934 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-nmstate-lock\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.263960 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8n85\" (UniqueName: \"kubernetes.io/projected/77f6b218-668c-4315-9ed5-be81d3c0ca42-kube-api-access-d8n85\") pod \"nmstate-metrics-54757c584b-qchm6\" (UID: \"77f6b218-668c-4315-9ed5-be81d3c0ca42\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.264005 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-dbus-socket\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.264042 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29tbr\" (UniqueName: \"kubernetes.io/projected/0697bb7e-f3c4-4945-ab46-21e0ae796a8d-kube-api-access-29tbr\") pod \"nmstate-webhook-8474b5b9d8-rwdj4\" (UID: \"0697bb7e-f3c4-4945-ab46-21e0ae796a8d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.264087 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd962\" (UniqueName: \"kubernetes.io/projected/f54897d6-d02f-4aca-8021-25f78df893d3-kube-api-access-hd962\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.364732 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rpn6\" (UniqueName: \"kubernetes.io/projected/544915b2-6bab-49e2-a670-0bb95ca121e6-kube-api-access-4rpn6\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.364811 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0697bb7e-f3c4-4945-ab46-21e0ae796a8d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-rwdj4\" (UID: \"0697bb7e-f3c4-4945-ab46-21e0ae796a8d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.364860 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-ovs-socket\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.364904 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-nmstate-lock\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.364934 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8n85\" (UniqueName: \"kubernetes.io/projected/77f6b218-668c-4315-9ed5-be81d3c0ca42-kube-api-access-d8n85\") pod \"nmstate-metrics-54757c584b-qchm6\" (UID: \"77f6b218-668c-4315-9ed5-be81d3c0ca42\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.364967 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-dbus-socket\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.365013 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29tbr\" (UniqueName: \"kubernetes.io/projected/0697bb7e-f3c4-4945-ab46-21e0ae796a8d-kube-api-access-29tbr\") pod \"nmstate-webhook-8474b5b9d8-rwdj4\" (UID: \"0697bb7e-f3c4-4945-ab46-21e0ae796a8d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.365076 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/544915b2-6bab-49e2-a670-0bb95ca121e6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.365139 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd962\" (UniqueName: \"kubernetes.io/projected/f54897d6-d02f-4aca-8021-25f78df893d3-kube-api-access-hd962\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.365157 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-ovs-socket\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.365176 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/544915b2-6bab-49e2-a670-0bb95ca121e6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.365253 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-nmstate-lock\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.365830 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f54897d6-d02f-4aca-8021-25f78df893d3-dbus-socket\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.375942 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0697bb7e-f3c4-4945-ab46-21e0ae796a8d-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-rwdj4\" (UID: \"0697bb7e-f3c4-4945-ab46-21e0ae796a8d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.392304 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd962\" (UniqueName: \"kubernetes.io/projected/f54897d6-d02f-4aca-8021-25f78df893d3-kube-api-access-hd962\") pod \"nmstate-handler-pvj5p\" (UID: \"f54897d6-d02f-4aca-8021-25f78df893d3\") " pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.392780 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8n85\" (UniqueName: \"kubernetes.io/projected/77f6b218-668c-4315-9ed5-be81d3c0ca42-kube-api-access-d8n85\") pod \"nmstate-metrics-54757c584b-qchm6\" (UID: \"77f6b218-668c-4315-9ed5-be81d3c0ca42\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.393252 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29tbr\" (UniqueName: \"kubernetes.io/projected/0697bb7e-f3c4-4945-ab46-21e0ae796a8d-kube-api-access-29tbr\") pod \"nmstate-webhook-8474b5b9d8-rwdj4\" (UID: \"0697bb7e-f3c4-4945-ab46-21e0ae796a8d\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.414981 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.423937 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.431627 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54856c68c6-xd2f2"]
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.434626 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.453510 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.459991 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54856c68c6-xd2f2"]
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.469078 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/544915b2-6bab-49e2-a670-0bb95ca121e6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.469313 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/544915b2-6bab-49e2-a670-0bb95ca121e6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.469354 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rpn6\" (UniqueName: \"kubernetes.io/projected/544915b2-6bab-49e2-a670-0bb95ca121e6-kube-api-access-4rpn6\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.470316 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/544915b2-6bab-49e2-a670-0bb95ca121e6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.474006 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/544915b2-6bab-49e2-a670-0bb95ca121e6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.487206 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rpn6\" (UniqueName: \"kubernetes.io/projected/544915b2-6bab-49e2-a670-0bb95ca121e6-kube-api-access-4rpn6\") pod \"nmstate-console-plugin-7754f76f8b-2qkpv\" (UID: \"544915b2-6bab-49e2-a670-0bb95ca121e6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: W0122 14:16:37.495324 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf54897d6_d02f_4aca_8021_25f78df893d3.slice/crio-9b733d6893cb91bddd60c9a5d85043565e8946ccf2116847de148e2bed96a6d7 WatchSource:0}: Error finding container 9b733d6893cb91bddd60c9a5d85043565e8946ccf2116847de148e2bed96a6d7: Status 404 returned error can't find the container with id 9b733d6893cb91bddd60c9a5d85043565e8946ccf2116847de148e2bed96a6d7
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.545121 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.569930 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-service-ca\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.570007 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-config\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.570142 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-trusted-ca-bundle\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.570214 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-serving-cert\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.570276 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-oauth-serving-cert\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.570303 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-oauth-config\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.570333 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf7c6\" (UniqueName: \"kubernetes.io/projected/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-kube-api-access-sf7c6\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.648845 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-qchm6"]
Jan 22 14:16:37 crc kubenswrapper[4801]: W0122 14:16:37.656909 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f6b218_668c_4315_9ed5_be81d3c0ca42.slice/crio-ef890e76c379c6ed4a4adc713f73f2748d4f159e046f16ef7cf80c3f48a1a2ee WatchSource:0}: Error finding container ef890e76c379c6ed4a4adc713f73f2748d4f159e046f16ef7cf80c3f48a1a2ee: Status 404 returned error can't find the container with id ef890e76c379c6ed4a4adc713f73f2748d4f159e046f16ef7cf80c3f48a1a2ee
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.671163 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-trusted-ca-bundle\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.671201 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-serving-cert\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.671236 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-oauth-serving-cert\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.671254 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-oauth-config\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.671270 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf7c6\" (UniqueName: \"kubernetes.io/projected/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-kube-api-access-sf7c6\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.671308 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-service-ca\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.671328 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-config\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.672894 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-service-ca\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.672932 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-config\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.672934 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-oauth-serving-cert\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.673596 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-trusted-ca-bundle\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.676292 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-oauth-config\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.680766 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-console-serving-cert\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.693422 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf7c6\" (UniqueName: \"kubernetes.io/projected/8eb62272-c919-4f75-8a5e-3b4eb4ebef6a-kube-api-access-sf7c6\") pod \"console-54856c68c6-xd2f2\" (UID: \"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a\") " pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.699511 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"]
Jan 22 14:16:37 crc kubenswrapper[4801]: W0122 14:16:37.704979 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0697bb7e_f3c4_4945_ab46_21e0ae796a8d.slice/crio-44085c33c0d564a5c45f407d39042d6b6521d7f14586a91790e57670bcad8875 WatchSource:0}: Error finding container 44085c33c0d564a5c45f407d39042d6b6521d7f14586a91790e57670bcad8875: Status 404 returned error can't find the container with id 44085c33c0d564a5c45f407d39042d6b6521d7f14586a91790e57670bcad8875
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.736090 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv"]
Jan 22 14:16:37 crc kubenswrapper[4801]: W0122 14:16:37.739982 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod544915b2_6bab_49e2_a670_0bb95ca121e6.slice/crio-be2c01fa09841cb6536ae954794b5737bef4eb35f5e52ca23fb6324a67846bbd WatchSource:0}: Error finding container be2c01fa09841cb6536ae954794b5737bef4eb35f5e52ca23fb6324a67846bbd: Status 404 returned error can't find the container with id be2c01fa09841cb6536ae954794b5737bef4eb35f5e52ca23fb6324a67846bbd
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.790543 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:37 crc kubenswrapper[4801]: I0122 14:16:37.986038 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54856c68c6-xd2f2"]
Jan 22 14:16:37 crc kubenswrapper[4801]: W0122 14:16:37.989648 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eb62272_c919_4f75_8a5e_3b4eb4ebef6a.slice/crio-dab58f4e19d53b01cb1c392346130e1d531a46c0ec664eabab044c3190d86918 WatchSource:0}: Error finding container dab58f4e19d53b01cb1c392346130e1d531a46c0ec664eabab044c3190d86918: Status 404 returned error can't find the container with id dab58f4e19d53b01cb1c392346130e1d531a46c0ec664eabab044c3190d86918
Jan 22 14:16:38 crc kubenswrapper[4801]: I0122 14:16:38.076295 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54856c68c6-xd2f2" event={"ID":"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a","Type":"ContainerStarted","Data":"dab58f4e19d53b01cb1c392346130e1d531a46c0ec664eabab044c3190d86918"}
Jan 22 14:16:38 crc kubenswrapper[4801]: I0122 14:16:38.077133 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4" event={"ID":"0697bb7e-f3c4-4945-ab46-21e0ae796a8d","Type":"ContainerStarted","Data":"44085c33c0d564a5c45f407d39042d6b6521d7f14586a91790e57670bcad8875"}
Jan 22 14:16:38 crc kubenswrapper[4801]: I0122 14:16:38.078187 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6" event={"ID":"77f6b218-668c-4315-9ed5-be81d3c0ca42","Type":"ContainerStarted","Data":"ef890e76c379c6ed4a4adc713f73f2748d4f159e046f16ef7cf80c3f48a1a2ee"}
Jan 22 14:16:38 crc kubenswrapper[4801]: I0122 14:16:38.079265 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv" event={"ID":"544915b2-6bab-49e2-a670-0bb95ca121e6","Type":"ContainerStarted","Data":"be2c01fa09841cb6536ae954794b5737bef4eb35f5e52ca23fb6324a67846bbd"}
Jan 22 14:16:38 crc kubenswrapper[4801]: I0122 14:16:38.080098 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pvj5p" event={"ID":"f54897d6-d02f-4aca-8021-25f78df893d3","Type":"ContainerStarted","Data":"9b733d6893cb91bddd60c9a5d85043565e8946ccf2116847de148e2bed96a6d7"}
Jan 22 14:16:39 crc kubenswrapper[4801]: I0122 14:16:39.089754 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54856c68c6-xd2f2" event={"ID":"8eb62272-c919-4f75-8a5e-3b4eb4ebef6a","Type":"ContainerStarted","Data":"1f4d661c6fe9a630979476e6eefac51ae65780feff31bea6d1d9f712ac7587d5"}
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.102323 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pvj5p" event={"ID":"f54897d6-d02f-4aca-8021-25f78df893d3","Type":"ContainerStarted","Data":"82afdb79d4952bb33273f544331d468f5845a469953a5ddd87b899af6a6c5578"}
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.103299 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.104159 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4" event={"ID":"0697bb7e-f3c4-4945-ab46-21e0ae796a8d","Type":"ContainerStarted","Data":"7c4f86c598cd081481749790d9db72620fb242a3f9f6697af6e207b6fce0773b"}
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.104594 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.106413 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6" event={"ID":"77f6b218-668c-4315-9ed5-be81d3c0ca42","Type":"ContainerStarted","Data":"70c16a2641668ced5725f0b2526cb03c58654047d9fdc7cab99e3e514f4ecc84"}
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.107327 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv" event={"ID":"544915b2-6bab-49e2-a670-0bb95ca121e6","Type":"ContainerStarted","Data":"388a9ac1f90bc789cda0e9c5d255673df747df32a60a3520cc72c96231ddf88e"}
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.120639 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54856c68c6-xd2f2" podStartSLOduration=4.120620687 podStartE2EDuration="4.120620687s" podCreationTimestamp="2026-01-22 14:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:16:39.108149404 +0000 UTC m=+747.810049587" watchObservedRunningTime="2026-01-22 14:16:41.120620687 +0000 UTC m=+749.822520890"
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.136299 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-pvj5p" podStartSLOduration=1.380620625 podStartE2EDuration="4.136275361s" podCreationTimestamp="2026-01-22 14:16:37 +0000 UTC" firstStartedPulling="2026-01-22 14:16:37.497325738 +0000 UTC m=+746.199225921" lastFinishedPulling="2026-01-22 14:16:40.252980434 +0000 UTC m=+748.954880657" observedRunningTime="2026-01-22 14:16:41.119950057 +0000 UTC m=+749.821850250" watchObservedRunningTime="2026-01-22 14:16:41.136275361 +0000 UTC m=+749.838175554"
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.137632 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2qkpv" podStartSLOduration=1.645087845 podStartE2EDuration="4.13762303s" podCreationTimestamp="2026-01-22 14:16:37 +0000 UTC" firstStartedPulling="2026-01-22 14:16:37.741821399 +0000 UTC m=+746.443721582" lastFinishedPulling="2026-01-22 14:16:40.234356584 +0000 UTC m=+748.936256767" observedRunningTime="2026-01-22 14:16:41.131178343 +0000 UTC m=+749.833078536" watchObservedRunningTime="2026-01-22 14:16:41.13762303 +0000 UTC m=+749.839523213"
Jan 22 14:16:41 crc kubenswrapper[4801]: I0122 14:16:41.148057 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4" podStartSLOduration=1.6009075830000001 podStartE2EDuration="4.148039832s" podCreationTimestamp="2026-01-22 14:16:37 +0000 UTC" firstStartedPulling="2026-01-22 14:16:37.706873205 +0000 UTC m=+746.408773378" lastFinishedPulling="2026-01-22 14:16:40.254005444 +0000 UTC m=+748.955905627" observedRunningTime="2026-01-22 14:16:41.147379603 +0000 UTC m=+749.849279786" watchObservedRunningTime="2026-01-22 14:16:41.148039832 +0000 UTC m=+749.849940015"
Jan 22 14:16:43 crc kubenswrapper[4801]: I0122 14:16:43.125183 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6" event={"ID":"77f6b218-668c-4315-9ed5-be81d3c0ca42","Type":"ContainerStarted","Data":"251564dd5c82dea748be9fc95d34db7a92aaf8314ee1e3a03a228c6d387ec73a"}
Jan 22 14:16:45 crc kubenswrapper[4801]: I0122 14:16:45.620502 4801 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 22 14:16:47 crc kubenswrapper[4801]: I0122 14:16:47.481709 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-pvj5p"
Jan 22 14:16:47 crc kubenswrapper[4801]: I0122 14:16:47.501672 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-qchm6" podStartSLOduration=5.525136599 podStartE2EDuration="10.501645372s" podCreationTimestamp="2026-01-22 14:16:37 +0000 UTC" firstStartedPulling="2026-01-22 14:16:37.66013469 +0000 UTC m=+746.362034863" lastFinishedPulling="2026-01-22 14:16:42.636643453 +0000 UTC m=+751.338543636" observedRunningTime="2026-01-22 14:16:43.148865978 +0000 UTC m=+751.850766201" watchObservedRunningTime="2026-01-22 14:16:47.501645372 +0000 UTC m=+756.203545585"
Jan 22 14:16:47 crc kubenswrapper[4801]: I0122 14:16:47.790691 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:47 crc kubenswrapper[4801]: I0122 14:16:47.790753 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:47 crc kubenswrapper[4801]: I0122 14:16:47.794610 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:48 crc kubenswrapper[4801]: I0122 14:16:48.167857 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54856c68c6-xd2f2"
Jan 22 14:16:48 crc kubenswrapper[4801]: I0122 14:16:48.248593 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nbhfx"]
Jan 22 14:16:57 crc kubenswrapper[4801]: I0122 14:16:57.430308 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rwdj4"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.378628 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"]
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.380535 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.383272 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.394575 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"]
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.568889 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.568998 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbs5l\" (UniqueName: \"kubernetes.io/projected/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-kube-api-access-wbs5l\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.569126 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.671413 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.671559 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbs5l\" (UniqueName: \"kubernetes.io/projected/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-kube-api-access-wbs5l\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.671655 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.672963 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.673062 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.705373 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbs5l\" (UniqueName: \"kubernetes.io/projected/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-kube-api-access-wbs5l\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"
Jan 22 14:17:10 crc kubenswrapper[4801]: I0122 14:17:10.709844 4801 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" Jan 22 14:17:11 crc kubenswrapper[4801]: I0122 14:17:11.434059 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj"] Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.310485 4801 generic.go:334] "Generic (PLEG): container finished" podID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerID="fc9d11d06038677e8349ab1e401a5d857cbdb83cba047a975c0db7f77c225c5c" exitCode=0 Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.310578 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" event={"ID":"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4","Type":"ContainerDied","Data":"fc9d11d06038677e8349ab1e401a5d857cbdb83cba047a975c0db7f77c225c5c"} Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.311182 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" event={"ID":"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4","Type":"ContainerStarted","Data":"f63cd1e078cda75f418a3e2e94173175cc847029ebed4dc0c308983b01d8af09"} Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.720029 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l4bqv"] Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.721371 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.736124 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4bqv"] Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.897284 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-catalog-content\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.897373 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trtbh\" (UniqueName: \"kubernetes.io/projected/b0114a52-8099-4a35-a0b7-73458412c5da-kube-api-access-trtbh\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.897406 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-utilities\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.998639 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-catalog-content\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.998705 4801 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-trtbh\" (UniqueName: \"kubernetes.io/projected/b0114a52-8099-4a35-a0b7-73458412c5da-kube-api-access-trtbh\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.998738 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-utilities\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.999222 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-catalog-content\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:12 crc kubenswrapper[4801]: I0122 14:17:12.999287 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-utilities\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.020518 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trtbh\" (UniqueName: \"kubernetes.io/projected/b0114a52-8099-4a35-a0b7-73458412c5da-kube-api-access-trtbh\") pod \"redhat-operators-l4bqv\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.050935 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.451749 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nbhfx" podUID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" containerName="console" containerID="cri-o://747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6" gracePeriod=15 Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.622508 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4bqv"] Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.788708 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nbhfx_c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94/console/0.log" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.788958 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.910855 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-config\") pod \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.910970 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-service-ca\") pod \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.910996 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl9zf\" (UniqueName: \"kubernetes.io/projected/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-kube-api-access-wl9zf\") pod 
\"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.911594 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-serving-cert\") pod \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.911640 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-oauth-serving-cert\") pod \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.911664 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-oauth-config\") pod \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.911686 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-trusted-ca-bundle\") pod \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\" (UID: \"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94\") " Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.911901 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-config" (OuterVolumeSpecName: "console-config") pod "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" (UID: "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.911923 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-service-ca" (OuterVolumeSpecName: "service-ca") pod "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" (UID: "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.912167 4801 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.912181 4801 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.912280 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" (UID: "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.912423 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" (UID: "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.919640 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-kube-api-access-wl9zf" (OuterVolumeSpecName: "kube-api-access-wl9zf") pod "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" (UID: "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94"). InnerVolumeSpecName "kube-api-access-wl9zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.919678 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" (UID: "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:17:13 crc kubenswrapper[4801]: I0122 14:17:13.919963 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" (UID: "c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.013268 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl9zf\" (UniqueName: \"kubernetes.io/projected/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-kube-api-access-wl9zf\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.013316 4801 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.013329 4801 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.013342 4801 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.013354 4801 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.322778 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nbhfx_c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94/console/0.log" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.322838 4801 generic.go:334] "Generic (PLEG): container finished" podID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" containerID="747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6" exitCode=2 Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.322915 4801 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbhfx" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.322917 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbhfx" event={"ID":"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94","Type":"ContainerDied","Data":"747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6"} Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.323054 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbhfx" event={"ID":"c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94","Type":"ContainerDied","Data":"bbf97dc8a8e0c027dd6b95968532c6ac2563ec0a54b7c3eccc41d23c0f30233d"} Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.323078 4801 scope.go:117] "RemoveContainer" containerID="747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.324770 4801 generic.go:334] "Generic (PLEG): container finished" podID="b0114a52-8099-4a35-a0b7-73458412c5da" containerID="cb1dfcba569b1bc909c58c4d483a009d421873d453e2db42b2176df7865ae663" exitCode=0 Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.324832 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4bqv" event={"ID":"b0114a52-8099-4a35-a0b7-73458412c5da","Type":"ContainerDied","Data":"cb1dfcba569b1bc909c58c4d483a009d421873d453e2db42b2176df7865ae663"} Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.324862 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4bqv" event={"ID":"b0114a52-8099-4a35-a0b7-73458412c5da","Type":"ContainerStarted","Data":"c5c2b70e622f4384e7d311af8bc5178eb364798d84757d01f190b19d8a5c691b"} Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.328361 4801 generic.go:334] "Generic (PLEG): container finished" podID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" 
containerID="d9ec08a1811b4795a27924736a77816b69a204a6b634e380dbb3f8fee3f4be1e" exitCode=0 Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.328403 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" event={"ID":"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4","Type":"ContainerDied","Data":"d9ec08a1811b4795a27924736a77816b69a204a6b634e380dbb3f8fee3f4be1e"} Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.340168 4801 scope.go:117] "RemoveContainer" containerID="747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6" Jan 22 14:17:14 crc kubenswrapper[4801]: E0122 14:17:14.340675 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6\": container with ID starting with 747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6 not found: ID does not exist" containerID="747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.340718 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6"} err="failed to get container status \"747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6\": rpc error: code = NotFound desc = could not find container \"747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6\": container with ID starting with 747a699ec7ccb2cdaf5ecb8785dee963781b1baf4fbdb66fe52d4fcb3ac2afd6 not found: ID does not exist" Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.392558 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nbhfx"] Jan 22 14:17:14 crc kubenswrapper[4801]: I0122 14:17:14.393188 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-f9d7485db-nbhfx"] Jan 22 14:17:15 crc kubenswrapper[4801]: I0122 14:17:15.348210 4801 generic.go:334] "Generic (PLEG): container finished" podID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerID="b845e9bf6aeaf7bb5289983bdb9a1af0d31c0e71eb1791447c946883e312d5a0" exitCode=0 Jan 22 14:17:15 crc kubenswrapper[4801]: I0122 14:17:15.348374 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" event={"ID":"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4","Type":"ContainerDied","Data":"b845e9bf6aeaf7bb5289983bdb9a1af0d31c0e71eb1791447c946883e312d5a0"} Jan 22 14:17:15 crc kubenswrapper[4801]: I0122 14:17:15.740597 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" path="/var/lib/kubelet/pods/c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94/volumes" Jan 22 14:17:16 crc kubenswrapper[4801]: I0122 14:17:16.357493 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4bqv" event={"ID":"b0114a52-8099-4a35-a0b7-73458412c5da","Type":"ContainerStarted","Data":"94a1e58598ef7ff80f02b6ecdcda5164bc1ebc1ac2bc1489b1d496ce58f40055"} Jan 22 14:17:16 crc kubenswrapper[4801]: I0122 14:17:16.983239 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.151224 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbs5l\" (UniqueName: \"kubernetes.io/projected/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-kube-api-access-wbs5l\") pod \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.151372 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-util\") pod \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.151414 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-bundle\") pod \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\" (UID: \"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4\") " Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.152399 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-bundle" (OuterVolumeSpecName: "bundle") pod "1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" (UID: "1a18dfb8-3d0d-4014-97ff-a3d6d77736d4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.161638 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-util" (OuterVolumeSpecName: "util") pod "1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" (UID: "1a18dfb8-3d0d-4014-97ff-a3d6d77736d4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.237903 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-kube-api-access-wbs5l" (OuterVolumeSpecName: "kube-api-access-wbs5l") pod "1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" (UID: "1a18dfb8-3d0d-4014-97ff-a3d6d77736d4"). InnerVolumeSpecName "kube-api-access-wbs5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.252800 4801 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-util\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.252836 4801 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.252848 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbs5l\" (UniqueName: \"kubernetes.io/projected/1a18dfb8-3d0d-4014-97ff-a3d6d77736d4-kube-api-access-wbs5l\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.366286 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" event={"ID":"1a18dfb8-3d0d-4014-97ff-a3d6d77736d4","Type":"ContainerDied","Data":"f63cd1e078cda75f418a3e2e94173175cc847029ebed4dc0c308983b01d8af09"} Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.366317 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj" Jan 22 14:17:17 crc kubenswrapper[4801]: I0122 14:17:17.366530 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f63cd1e078cda75f418a3e2e94173175cc847029ebed4dc0c308983b01d8af09" Jan 22 14:17:18 crc kubenswrapper[4801]: I0122 14:17:18.376621 4801 generic.go:334] "Generic (PLEG): container finished" podID="b0114a52-8099-4a35-a0b7-73458412c5da" containerID="94a1e58598ef7ff80f02b6ecdcda5164bc1ebc1ac2bc1489b1d496ce58f40055" exitCode=0 Jan 22 14:17:18 crc kubenswrapper[4801]: I0122 14:17:18.376662 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4bqv" event={"ID":"b0114a52-8099-4a35-a0b7-73458412c5da","Type":"ContainerDied","Data":"94a1e58598ef7ff80f02b6ecdcda5164bc1ebc1ac2bc1489b1d496ce58f40055"} Jan 22 14:17:19 crc kubenswrapper[4801]: I0122 14:17:19.382500 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4bqv" event={"ID":"b0114a52-8099-4a35-a0b7-73458412c5da","Type":"ContainerStarted","Data":"b587c3a4840320cd1ad3a37fc98a703c4087788950f259e5ce289c78e944ec5c"} Jan 22 14:17:19 crc kubenswrapper[4801]: I0122 14:17:19.399356 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l4bqv" podStartSLOduration=2.906003147 podStartE2EDuration="7.399341707s" podCreationTimestamp="2026-01-22 14:17:12 +0000 UTC" firstStartedPulling="2026-01-22 14:17:14.326384905 +0000 UTC m=+783.028285088" lastFinishedPulling="2026-01-22 14:17:18.819723455 +0000 UTC m=+787.521623648" observedRunningTime="2026-01-22 14:17:19.396794845 +0000 UTC m=+788.098695048" watchObservedRunningTime="2026-01-22 14:17:19.399341707 +0000 UTC m=+788.101241890" Jan 22 14:17:23 crc kubenswrapper[4801]: I0122 14:17:23.052427 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:23 crc kubenswrapper[4801]: I0122 14:17:23.053533 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:24 crc kubenswrapper[4801]: I0122 14:17:24.150710 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l4bqv" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="registry-server" probeResult="failure" output=< Jan 22 14:17:24 crc kubenswrapper[4801]: timeout: failed to connect service ":50051" within 1s Jan 22 14:17:24 crc kubenswrapper[4801]: > Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.664589 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-66cd969-xk8wl"] Jan 22 14:17:26 crc kubenswrapper[4801]: E0122 14:17:26.664801 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" containerName="console" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.664813 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" containerName="console" Jan 22 14:17:26 crc kubenswrapper[4801]: E0122 14:17:26.664824 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerName="util" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.664829 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerName="util" Jan 22 14:17:26 crc kubenswrapper[4801]: E0122 14:17:26.664846 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerName="extract" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.664867 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerName="extract" Jan 22 14:17:26 crc 
kubenswrapper[4801]: E0122 14:17:26.664877 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerName="pull" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.664883 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerName="pull" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.664974 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="c477f6d9-e4b6-4a2d-bfd2-e54d5039ec94" containerName="console" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.664988 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a18dfb8-3d0d-4014-97ff-a3d6d77736d4" containerName="extract" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.665365 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.667500 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.667596 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-j8zww" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.668007 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.669100 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.677324 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.685692 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-66cd969-xk8wl"] Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.742755 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krd8b\" (UniqueName: \"kubernetes.io/projected/02287e96-efd1-40d3-a38c-b3e5eed73386-kube-api-access-krd8b\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.742853 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02287e96-efd1-40d3-a38c-b3e5eed73386-webhook-cert\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.742882 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02287e96-efd1-40d3-a38c-b3e5eed73386-apiservice-cert\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.843955 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krd8b\" (UniqueName: \"kubernetes.io/projected/02287e96-efd1-40d3-a38c-b3e5eed73386-kube-api-access-krd8b\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.844088 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02287e96-efd1-40d3-a38c-b3e5eed73386-webhook-cert\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.844124 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02287e96-efd1-40d3-a38c-b3e5eed73386-apiservice-cert\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.850547 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02287e96-efd1-40d3-a38c-b3e5eed73386-apiservice-cert\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.851106 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02287e96-efd1-40d3-a38c-b3e5eed73386-webhook-cert\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.872255 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krd8b\" (UniqueName: \"kubernetes.io/projected/02287e96-efd1-40d3-a38c-b3e5eed73386-kube-api-access-krd8b\") pod \"metallb-operator-controller-manager-66cd969-xk8wl\" (UID: \"02287e96-efd1-40d3-a38c-b3e5eed73386\") " 
pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.931012 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q"] Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.931770 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.934053 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.934883 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qphbh" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.945127 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwv9b\" (UniqueName: \"kubernetes.io/projected/e19728a3-4d6b-40d4-bb06-4f715a4ac345-kube-api-access-gwv9b\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.945213 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e19728a3-4d6b-40d4-bb06-4f715a4ac345-apiservice-cert\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.945259 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e19728a3-4d6b-40d4-bb06-4f715a4ac345-webhook-cert\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.947653 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.954138 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q"] Jan 22 14:17:26 crc kubenswrapper[4801]: I0122 14:17:26.982710 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.046829 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwv9b\" (UniqueName: \"kubernetes.io/projected/e19728a3-4d6b-40d4-bb06-4f715a4ac345-kube-api-access-gwv9b\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.046896 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e19728a3-4d6b-40d4-bb06-4f715a4ac345-apiservice-cert\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.046926 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e19728a3-4d6b-40d4-bb06-4f715a4ac345-webhook-cert\") pod 
\"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.053981 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e19728a3-4d6b-40d4-bb06-4f715a4ac345-webhook-cert\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.054244 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e19728a3-4d6b-40d4-bb06-4f715a4ac345-apiservice-cert\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.061751 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwv9b\" (UniqueName: \"kubernetes.io/projected/e19728a3-4d6b-40d4-bb06-4f715a4ac345-kube-api-access-gwv9b\") pod \"metallb-operator-webhook-server-7b5b556888-hxr6q\" (UID: \"e19728a3-4d6b-40d4-bb06-4f715a4ac345\") " pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.247827 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.550763 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-66cd969-xk8wl"] Jan 22 14:17:27 crc kubenswrapper[4801]: I0122 14:17:27.829035 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q"] Jan 22 14:17:27 crc kubenswrapper[4801]: W0122 14:17:27.829593 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19728a3_4d6b_40d4_bb06_4f715a4ac345.slice/crio-73a0967be27a2ecd9e5ce0324027bba753f0caba96917fec6302c1116b8ad2eb WatchSource:0}: Error finding container 73a0967be27a2ecd9e5ce0324027bba753f0caba96917fec6302c1116b8ad2eb: Status 404 returned error can't find the container with id 73a0967be27a2ecd9e5ce0324027bba753f0caba96917fec6302c1116b8ad2eb Jan 22 14:17:28 crc kubenswrapper[4801]: I0122 14:17:28.436793 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" event={"ID":"02287e96-efd1-40d3-a38c-b3e5eed73386","Type":"ContainerStarted","Data":"5a03dfb3aec9416c1589af991c3a59e99979e15a7a22337b0d9001b1b86ea399"} Jan 22 14:17:28 crc kubenswrapper[4801]: I0122 14:17:28.438302 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" event={"ID":"e19728a3-4d6b-40d4-bb06-4f715a4ac345","Type":"ContainerStarted","Data":"73a0967be27a2ecd9e5ce0324027bba753f0caba96917fec6302c1116b8ad2eb"} Jan 22 14:17:32 crc kubenswrapper[4801]: I0122 14:17:32.470826 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" 
event={"ID":"02287e96-efd1-40d3-a38c-b3e5eed73386","Type":"ContainerStarted","Data":"35927972fa974d08282d8580ee58eba544edd9dafce4a2a98a51784649a52a20"} Jan 22 14:17:32 crc kubenswrapper[4801]: I0122 14:17:32.471427 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:17:32 crc kubenswrapper[4801]: I0122 14:17:32.491040 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" podStartSLOduration=2.159501396 podStartE2EDuration="6.491018095s" podCreationTimestamp="2026-01-22 14:17:26 +0000 UTC" firstStartedPulling="2026-01-22 14:17:27.579656096 +0000 UTC m=+796.281556279" lastFinishedPulling="2026-01-22 14:17:31.911172795 +0000 UTC m=+800.613072978" observedRunningTime="2026-01-22 14:17:32.487109214 +0000 UTC m=+801.189009397" watchObservedRunningTime="2026-01-22 14:17:32.491018095 +0000 UTC m=+801.192918298" Jan 22 14:17:33 crc kubenswrapper[4801]: I0122 14:17:33.099503 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:33 crc kubenswrapper[4801]: I0122 14:17:33.152082 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:35 crc kubenswrapper[4801]: I0122 14:17:35.112548 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4bqv"] Jan 22 14:17:35 crc kubenswrapper[4801]: I0122 14:17:35.113848 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l4bqv" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="registry-server" containerID="cri-o://b587c3a4840320cd1ad3a37fc98a703c4087788950f259e5ce289c78e944ec5c" gracePeriod=2 Jan 22 14:17:35 crc kubenswrapper[4801]: I0122 14:17:35.492002 4801 
generic.go:334] "Generic (PLEG): container finished" podID="b0114a52-8099-4a35-a0b7-73458412c5da" containerID="b587c3a4840320cd1ad3a37fc98a703c4087788950f259e5ce289c78e944ec5c" exitCode=0 Jan 22 14:17:35 crc kubenswrapper[4801]: I0122 14:17:35.492039 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4bqv" event={"ID":"b0114a52-8099-4a35-a0b7-73458412c5da","Type":"ContainerDied","Data":"b587c3a4840320cd1ad3a37fc98a703c4087788950f259e5ce289c78e944ec5c"} Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.229121 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.234962 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-utilities\") pod \"b0114a52-8099-4a35-a0b7-73458412c5da\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.235044 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-catalog-content\") pod \"b0114a52-8099-4a35-a0b7-73458412c5da\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.235148 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trtbh\" (UniqueName: \"kubernetes.io/projected/b0114a52-8099-4a35-a0b7-73458412c5da-kube-api-access-trtbh\") pod \"b0114a52-8099-4a35-a0b7-73458412c5da\" (UID: \"b0114a52-8099-4a35-a0b7-73458412c5da\") " Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.236129 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-utilities" 
(OuterVolumeSpecName: "utilities") pod "b0114a52-8099-4a35-a0b7-73458412c5da" (UID: "b0114a52-8099-4a35-a0b7-73458412c5da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.246179 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0114a52-8099-4a35-a0b7-73458412c5da-kube-api-access-trtbh" (OuterVolumeSpecName: "kube-api-access-trtbh") pod "b0114a52-8099-4a35-a0b7-73458412c5da" (UID: "b0114a52-8099-4a35-a0b7-73458412c5da"). InnerVolumeSpecName "kube-api-access-trtbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.336893 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trtbh\" (UniqueName: \"kubernetes.io/projected/b0114a52-8099-4a35-a0b7-73458412c5da-kube-api-access-trtbh\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.336937 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.389063 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0114a52-8099-4a35-a0b7-73458412c5da" (UID: "b0114a52-8099-4a35-a0b7-73458412c5da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.437950 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0114a52-8099-4a35-a0b7-73458412c5da-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.506038 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4bqv" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.507124 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4bqv" event={"ID":"b0114a52-8099-4a35-a0b7-73458412c5da","Type":"ContainerDied","Data":"c5c2b70e622f4384e7d311af8bc5178eb364798d84757d01f190b19d8a5c691b"} Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.507229 4801 scope.go:117] "RemoveContainer" containerID="b587c3a4840320cd1ad3a37fc98a703c4087788950f259e5ce289c78e944ec5c" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.509043 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" event={"ID":"e19728a3-4d6b-40d4-bb06-4f715a4ac345","Type":"ContainerStarted","Data":"24046052e24c1016a338679e9f093aa511c1cb270a650116eaa518b92eebcff4"} Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.509492 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.525544 4801 scope.go:117] "RemoveContainer" containerID="94a1e58598ef7ff80f02b6ecdcda5164bc1ebc1ac2bc1489b1d496ce58f40055" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.544999 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" podStartSLOduration=2.27963382 
podStartE2EDuration="11.544979327s" podCreationTimestamp="2026-01-22 14:17:26 +0000 UTC" firstStartedPulling="2026-01-22 14:17:27.832791451 +0000 UTC m=+796.534691634" lastFinishedPulling="2026-01-22 14:17:37.098136958 +0000 UTC m=+805.800037141" observedRunningTime="2026-01-22 14:17:37.536326801 +0000 UTC m=+806.238227004" watchObservedRunningTime="2026-01-22 14:17:37.544979327 +0000 UTC m=+806.246879510" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.559230 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4bqv"] Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.560942 4801 scope.go:117] "RemoveContainer" containerID="cb1dfcba569b1bc909c58c4d483a009d421873d453e2db42b2176df7865ae663" Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.565971 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l4bqv"] Jan 22 14:17:37 crc kubenswrapper[4801]: I0122 14:17:37.582221 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" path="/var/lib/kubelet/pods/b0114a52-8099-4a35-a0b7-73458412c5da/volumes" Jan 22 14:17:47 crc kubenswrapper[4801]: I0122 14:17:47.252536 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7b5b556888-hxr6q" Jan 22 14:18:06 crc kubenswrapper[4801]: I0122 14:18:06.985682 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-66cd969-xk8wl" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.651293 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dvff6"] Jan 22 14:18:07 crc kubenswrapper[4801]: E0122 14:18:07.651893 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="extract-content" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 
14:18:07.651911 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="extract-content" Jan 22 14:18:07 crc kubenswrapper[4801]: E0122 14:18:07.651924 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="registry-server" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.651932 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="registry-server" Jan 22 14:18:07 crc kubenswrapper[4801]: E0122 14:18:07.651942 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="extract-utilities" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.651948 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="extract-utilities" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.652064 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0114a52-8099-4a35-a0b7-73458412c5da" containerName="registry-server" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.654235 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.658311 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.658382 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n4lnd" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.658557 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.663965 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr"] Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.665030 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.670333 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.689624 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr"] Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.754215 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-46zlw"] Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.755220 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.758536 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.759190 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.759432 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kzcmk" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.759887 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.772405 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-fln4v"] Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.773534 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.778862 4801 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.782851 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-conf\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783049 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-sockets\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783130 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-startup\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783234 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2789e45f-da51-4236-a35b-bce9acd65d2c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-s2qlr\" (UID: \"2789e45f-da51-4236-a35b-bce9acd65d2c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783319 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p79cl\" (UniqueName: 
\"kubernetes.io/projected/2789e45f-da51-4236-a35b-bce9acd65d2c-kube-api-access-p79cl\") pod \"frr-k8s-webhook-server-7df86c4f6c-s2qlr\" (UID: \"2789e45f-da51-4236-a35b-bce9acd65d2c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783405 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-reloader\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783514 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-metrics-certs\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783635 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-metrics\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.783744 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z2d8\" (UniqueName: \"kubernetes.io/projected/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-kube-api-access-6z2d8\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.789879 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-fln4v"] Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.884759 4801 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z2d8\" (UniqueName: \"kubernetes.io/projected/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-kube-api-access-6z2d8\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.884826 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.884851 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-metallb-excludel2\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.884893 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-conf\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.884922 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-sockets\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.884944 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-startup\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.884986 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4f7\" (UniqueName: \"kubernetes.io/projected/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-kube-api-access-zq4f7\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885014 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-metrics-certs\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885042 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6vx\" (UniqueName: \"kubernetes.io/projected/4804b763-eb34-47b5-958c-dd672ac9a5be-kube-api-access-mz6vx\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885068 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2789e45f-da51-4236-a35b-bce9acd65d2c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-s2qlr\" (UID: \"2789e45f-da51-4236-a35b-bce9acd65d2c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885096 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p79cl\" (UniqueName: 
\"kubernetes.io/projected/2789e45f-da51-4236-a35b-bce9acd65d2c-kube-api-access-p79cl\") pod \"frr-k8s-webhook-server-7df86c4f6c-s2qlr\" (UID: \"2789e45f-da51-4236-a35b-bce9acd65d2c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885121 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-cert\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885153 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-reloader\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885176 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-metrics-certs\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885197 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-metrics\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.885220 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-metrics-certs\") pod \"speaker-46zlw\" (UID: 
\"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.886018 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-metrics\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.886233 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-sockets\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.886297 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-conf\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.886373 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-reloader\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.886548 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-frr-startup\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.891477 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-metrics-certs\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.893402 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2789e45f-da51-4236-a35b-bce9acd65d2c-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-s2qlr\" (UID: \"2789e45f-da51-4236-a35b-bce9acd65d2c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.907078 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p79cl\" (UniqueName: \"kubernetes.io/projected/2789e45f-da51-4236-a35b-bce9acd65d2c-kube-api-access-p79cl\") pod \"frr-k8s-webhook-server-7df86c4f6c-s2qlr\" (UID: \"2789e45f-da51-4236-a35b-bce9acd65d2c\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.907829 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z2d8\" (UniqueName: \"kubernetes.io/projected/a92ad7b9-a7fd-49c2-b42e-132fa97b2228-kube-api-access-6z2d8\") pod \"frr-k8s-dvff6\" (UID: \"a92ad7b9-a7fd-49c2-b42e-132fa97b2228\") " pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.976892 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.987340 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4f7\" (UniqueName: \"kubernetes.io/projected/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-kube-api-access-zq4f7\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.987387 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-metrics-certs\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.987405 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6vx\" (UniqueName: \"kubernetes.io/projected/4804b763-eb34-47b5-958c-dd672ac9a5be-kube-api-access-mz6vx\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.987433 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-cert\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.987481 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-metrics-certs\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: 
I0122 14:18:07.987518 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.987533 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-metallb-excludel2\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.988139 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-metallb-excludel2\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: E0122 14:18:07.988376 4801 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 22 14:18:07 crc kubenswrapper[4801]: E0122 14:18:07.988432 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-metrics-certs podName:4804b763-eb34-47b5-958c-dd672ac9a5be nodeName:}" failed. No retries permitted until 2026-01-22 14:18:08.488417612 +0000 UTC m=+837.190317795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-metrics-certs") pod "controller-6968d8fdc4-fln4v" (UID: "4804b763-eb34-47b5-958c-dd672ac9a5be") : secret "controller-certs-secret" not found Jan 22 14:18:07 crc kubenswrapper[4801]: E0122 14:18:07.988519 4801 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 22 14:18:07 crc kubenswrapper[4801]: E0122 14:18:07.988638 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist podName:3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df nodeName:}" failed. No retries permitted until 2026-01-22 14:18:08.488620318 +0000 UTC m=+837.190520501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist") pod "speaker-46zlw" (UID: "3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df") : secret "metallb-memberlist" not found Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.993010 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-metrics-certs\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.994041 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-cert\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:07 crc kubenswrapper[4801]: I0122 14:18:07.994153 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.009484 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4f7\" (UniqueName: \"kubernetes.io/projected/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-kube-api-access-zq4f7\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.009822 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6vx\" (UniqueName: \"kubernetes.io/projected/4804b763-eb34-47b5-958c-dd672ac9a5be-kube-api-access-mz6vx\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.293918 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr"] Jan 22 14:18:08 crc kubenswrapper[4801]: W0122 14:18:08.296863 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2789e45f_da51_4236_a35b_bce9acd65d2c.slice/crio-88f0e673737340b693d10fcc0522bd860231f81889ba282952086ac282e565bb WatchSource:0}: Error finding container 88f0e673737340b693d10fcc0522bd860231f81889ba282952086ac282e565bb: Status 404 returned error can't find the container with id 88f0e673737340b693d10fcc0522bd860231f81889ba282952086ac282e565bb Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.521251 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.521750 4801 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-metrics-certs\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:08 crc kubenswrapper[4801]: E0122 14:18:08.521495 4801 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 22 14:18:08 crc kubenswrapper[4801]: E0122 14:18:08.521837 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist podName:3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df nodeName:}" failed. No retries permitted until 2026-01-22 14:18:09.521818912 +0000 UTC m=+838.223719095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist") pod "speaker-46zlw" (UID: "3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df") : secret "metallb-memberlist" not found Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.528185 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4804b763-eb34-47b5-958c-dd672ac9a5be-metrics-certs\") pod \"controller-6968d8fdc4-fln4v\" (UID: \"4804b763-eb34-47b5-958c-dd672ac9a5be\") " pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.694922 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.727364 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" event={"ID":"2789e45f-da51-4236-a35b-bce9acd65d2c","Type":"ContainerStarted","Data":"88f0e673737340b693d10fcc0522bd860231f81889ba282952086ac282e565bb"} Jan 22 14:18:08 crc kubenswrapper[4801]: I0122 14:18:08.933820 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-fln4v"] Jan 22 14:18:08 crc kubenswrapper[4801]: W0122 14:18:08.948695 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4804b763_eb34_47b5_958c_dd672ac9a5be.slice/crio-e0d5dc5fad82812160fabdc4d4681048fe1d110ebec4b38fc8c0dff5f080a1fe WatchSource:0}: Error finding container e0d5dc5fad82812160fabdc4d4681048fe1d110ebec4b38fc8c0dff5f080a1fe: Status 404 returned error can't find the container with id e0d5dc5fad82812160fabdc4d4681048fe1d110ebec4b38fc8c0dff5f080a1fe Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.537500 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.552047 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df-memberlist\") pod \"speaker-46zlw\" (UID: \"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df\") " pod="metallb-system/speaker-46zlw" Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.570214 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-46zlw" Jan 22 14:18:09 crc kubenswrapper[4801]: W0122 14:18:09.591062 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcb9bca_f8c6_4a31_87cd_41f7b4b3a6df.slice/crio-1ef6c9c977573979cbad2e2a8287d1ec33b43ee15f0d13747111c97ec3875b5b WatchSource:0}: Error finding container 1ef6c9c977573979cbad2e2a8287d1ec33b43ee15f0d13747111c97ec3875b5b: Status 404 returned error can't find the container with id 1ef6c9c977573979cbad2e2a8287d1ec33b43ee15f0d13747111c97ec3875b5b Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.735738 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-46zlw" event={"ID":"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df","Type":"ContainerStarted","Data":"1ef6c9c977573979cbad2e2a8287d1ec33b43ee15f0d13747111c97ec3875b5b"} Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.738424 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-fln4v" event={"ID":"4804b763-eb34-47b5-958c-dd672ac9a5be","Type":"ContainerStarted","Data":"993b1ab34004326f2bc220508ddf8fe3f136a99a2f6475d8ff307978cc9195e8"} Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.738519 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-fln4v" event={"ID":"4804b763-eb34-47b5-958c-dd672ac9a5be","Type":"ContainerStarted","Data":"e9a123e25972d5bb68a8b9bd38ff0fb30fd3f23040e6e7eca3f7ee7499c001e0"} Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.738532 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-fln4v" event={"ID":"4804b763-eb34-47b5-958c-dd672ac9a5be","Type":"ContainerStarted","Data":"e0d5dc5fad82812160fabdc4d4681048fe1d110ebec4b38fc8c0dff5f080a1fe"} Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.738561 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:09 crc kubenswrapper[4801]: I0122 14:18:09.739844 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerStarted","Data":"5aadfea70c42ee3a0faa99adb34d2edbc2fbf085003d97f2d8ff5fe1db429916"} Jan 22 14:18:10 crc kubenswrapper[4801]: I0122 14:18:10.755942 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-46zlw" event={"ID":"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df","Type":"ContainerStarted","Data":"a1cc553a9ca59b7fabb52a4ed82fef9d9b0d479f947534254f9d06ab3fcd5266"} Jan 22 14:18:10 crc kubenswrapper[4801]: I0122 14:18:10.755989 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-46zlw" event={"ID":"3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df","Type":"ContainerStarted","Data":"06b29a2fbc49af780fbd262a522c0b577e8bb811c659895ff19961837c2df409"} Jan 22 14:18:10 crc kubenswrapper[4801]: I0122 14:18:10.775142 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-fln4v" podStartSLOduration=3.775123825 podStartE2EDuration="3.775123825s" podCreationTimestamp="2026-01-22 14:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:18:09.759332144 +0000 UTC m=+838.461232347" watchObservedRunningTime="2026-01-22 14:18:10.775123825 +0000 UTC m=+839.477024008" Jan 22 14:18:10 crc kubenswrapper[4801]: I0122 14:18:10.777717 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-46zlw" podStartSLOduration=3.777711618 podStartE2EDuration="3.777711618s" podCreationTimestamp="2026-01-22 14:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:18:10.774636421 +0000 UTC 
m=+839.476536614" watchObservedRunningTime="2026-01-22 14:18:10.777711618 +0000 UTC m=+839.479611801" Jan 22 14:18:11 crc kubenswrapper[4801]: I0122 14:18:11.762494 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-46zlw" Jan 22 14:18:17 crc kubenswrapper[4801]: I0122 14:18:17.929155 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" event={"ID":"2789e45f-da51-4236-a35b-bce9acd65d2c","Type":"ContainerStarted","Data":"6d2ba2d20ac55e9d6884487fc0c71162c3e63f8c5770411b45efdaa78a120c9d"} Jan 22 14:18:17 crc kubenswrapper[4801]: I0122 14:18:17.930014 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:17 crc kubenswrapper[4801]: I0122 14:18:17.931614 4801 generic.go:334] "Generic (PLEG): container finished" podID="a92ad7b9-a7fd-49c2-b42e-132fa97b2228" containerID="ee21c91d731e45afb980a0fb4ef8ba4529faeff97cfacf00c417cccc83e1fa4e" exitCode=0 Jan 22 14:18:17 crc kubenswrapper[4801]: I0122 14:18:17.931652 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerDied","Data":"ee21c91d731e45afb980a0fb4ef8ba4529faeff97cfacf00c417cccc83e1fa4e"} Jan 22 14:18:17 crc kubenswrapper[4801]: I0122 14:18:17.977558 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" podStartSLOduration=2.132689174 podStartE2EDuration="10.977439328s" podCreationTimestamp="2026-01-22 14:18:07 +0000 UTC" firstStartedPulling="2026-01-22 14:18:08.298896896 +0000 UTC m=+837.000797079" lastFinishedPulling="2026-01-22 14:18:17.14364704 +0000 UTC m=+845.845547233" observedRunningTime="2026-01-22 14:18:17.95076095 +0000 UTC m=+846.652661183" watchObservedRunningTime="2026-01-22 14:18:17.977439328 +0000 UTC m=+846.679339511" Jan 22 
14:18:18 crc kubenswrapper[4801]: I0122 14:18:18.939786 4801 generic.go:334] "Generic (PLEG): container finished" podID="a92ad7b9-a7fd-49c2-b42e-132fa97b2228" containerID="b5cc372df721cbd0eefdb80685d211c0cb19fd150c65b46532b7e04607685074" exitCode=0 Jan 22 14:18:18 crc kubenswrapper[4801]: I0122 14:18:18.939901 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerDied","Data":"b5cc372df721cbd0eefdb80685d211c0cb19fd150c65b46532b7e04607685074"} Jan 22 14:18:19 crc kubenswrapper[4801]: I0122 14:18:19.591932 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-46zlw" Jan 22 14:18:19 crc kubenswrapper[4801]: I0122 14:18:19.958508 4801 generic.go:334] "Generic (PLEG): container finished" podID="a92ad7b9-a7fd-49c2-b42e-132fa97b2228" containerID="4f540a598f72b532d839afe77730ef956a283108cca1a7689517b689d04469c1" exitCode=0 Jan 22 14:18:19 crc kubenswrapper[4801]: I0122 14:18:19.958549 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerDied","Data":"4f540a598f72b532d839afe77730ef956a283108cca1a7689517b689d04469c1"} Jan 22 14:18:20 crc kubenswrapper[4801]: I0122 14:18:20.925810 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc"] Jan 22 14:18:20 crc kubenswrapper[4801]: I0122 14:18:20.927211 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:20 crc kubenswrapper[4801]: I0122 14:18:20.929996 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.944141 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc"] Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.967808 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerStarted","Data":"17ea9265f509f7eef5eeedf51aac437cdbbfaacda6505618a317e89d6d6499f8"} Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.967851 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerStarted","Data":"33352c6266393e8e4835880dc831ab3985e36744a82ea8eb426605c1bb3be58f"} Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.967863 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerStarted","Data":"1588baf41e2745173a649fc62b90ed8689cefc4d8556d83503c4589e320ad092"} Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.967875 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerStarted","Data":"a8b2b7642f37663dfb75b6cc300660b38a09fa2787e037319c2a84801516296e"} Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.967886 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" 
event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerStarted","Data":"8735eeb0263d4ba130dd09a70ca1cc9d09e29c99f3833b87b329caaba91ce647"} Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.967895 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dvff6" event={"ID":"a92ad7b9-a7fd-49c2-b42e-132fa97b2228","Type":"ContainerStarted","Data":"a88aced1402e3b05aca14429327528eb40bc6bcd458d4d9a795766ad40d12c58"} Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:20.969112 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.052538 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.052816 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.052901 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cmgr\" (UniqueName: \"kubernetes.io/projected/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-kube-api-access-6cmgr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.154901 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.154947 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cmgr\" (UniqueName: \"kubernetes.io/projected/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-kube-api-access-6cmgr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.155006 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.155761 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.155805 4801 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.181205 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cmgr\" (UniqueName: \"kubernetes.io/projected/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-kube-api-access-6cmgr\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.244531 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.657442 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dvff6" podStartSLOduration=6.493780695 podStartE2EDuration="14.65742641s" podCreationTimestamp="2026-01-22 14:18:07 +0000 UTC" firstStartedPulling="2026-01-22 14:18:08.995342611 +0000 UTC m=+837.697242804" lastFinishedPulling="2026-01-22 14:18:17.158988326 +0000 UTC m=+845.860888519" observedRunningTime="2026-01-22 14:18:20.992027748 +0000 UTC m=+849.693927931" watchObservedRunningTime="2026-01-22 14:18:21.65742641 +0000 UTC m=+850.359326593" Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.658499 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc"] Jan 22 14:18:21 crc kubenswrapper[4801]: W0122 14:18:21.668307 4801 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6341e1c6_5dc9_4d69_bd26_904c36a57f1b.slice/crio-956b0f7ecc5800b9076c4e807285283a3a0cc95d21c4f25647324ca32c61dfd6 WatchSource:0}: Error finding container 956b0f7ecc5800b9076c4e807285283a3a0cc95d21c4f25647324ca32c61dfd6: Status 404 returned error can't find the container with id 956b0f7ecc5800b9076c4e807285283a3a0cc95d21c4f25647324ca32c61dfd6 Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.975163 4801 generic.go:334] "Generic (PLEG): container finished" podID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerID="297724934e9182313a4ea750fcfd4625021625e5c4a1c5ac3cd61613123ca701" exitCode=0 Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.975240 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" event={"ID":"6341e1c6-5dc9-4d69-bd26-904c36a57f1b","Type":"ContainerDied","Data":"297724934e9182313a4ea750fcfd4625021625e5c4a1c5ac3cd61613123ca701"} Jan 22 14:18:21 crc kubenswrapper[4801]: I0122 14:18:21.975581 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" event={"ID":"6341e1c6-5dc9-4d69-bd26-904c36a57f1b","Type":"ContainerStarted","Data":"956b0f7ecc5800b9076c4e807285283a3a0cc95d21c4f25647324ca32c61dfd6"} Jan 22 14:18:22 crc kubenswrapper[4801]: I0122 14:18:22.978118 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:23 crc kubenswrapper[4801]: I0122 14:18:23.018433 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:27 crc kubenswrapper[4801]: I0122 14:18:27.011991 4801 generic.go:334] "Generic (PLEG): container finished" podID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" 
containerID="d1fe5a52bddeba43219a7ae959dd2abc430d86e71c72efcf482ad16a48bb68b8" exitCode=0 Jan 22 14:18:27 crc kubenswrapper[4801]: I0122 14:18:27.012051 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" event={"ID":"6341e1c6-5dc9-4d69-bd26-904c36a57f1b","Type":"ContainerDied","Data":"d1fe5a52bddeba43219a7ae959dd2abc430d86e71c72efcf482ad16a48bb68b8"} Jan 22 14:18:28 crc kubenswrapper[4801]: I0122 14:18:28.020505 4801 generic.go:334] "Generic (PLEG): container finished" podID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerID="09f2b7df8c270c9ea7754fad9834af632a56f28ca178da39ef7fe26b7ad3c798" exitCode=0 Jan 22 14:18:28 crc kubenswrapper[4801]: I0122 14:18:28.020560 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" event={"ID":"6341e1c6-5dc9-4d69-bd26-904c36a57f1b","Type":"ContainerDied","Data":"09f2b7df8c270c9ea7754fad9834af632a56f28ca178da39ef7fe26b7ad3c798"} Jan 22 14:18:28 crc kubenswrapper[4801]: I0122 14:18:28.054601 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-s2qlr" Jan 22 14:18:28 crc kubenswrapper[4801]: I0122 14:18:28.699240 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-fln4v" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.265158 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9lwrs"] Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.280278 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lwrs"] Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.280421 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.339056 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.374122 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-utilities\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.374179 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-catalog-content\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.374209 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728w2\" (UniqueName: \"kubernetes.io/projected/64c6c5bb-4951-472c-bc14-1fcddc01593a-kube-api-access-728w2\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.475251 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cmgr\" (UniqueName: \"kubernetes.io/projected/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-kube-api-access-6cmgr\") pod \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.475319 4801 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-util\") pod \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.475345 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-bundle\") pod \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\" (UID: \"6341e1c6-5dc9-4d69-bd26-904c36a57f1b\") " Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.475595 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-utilities\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.475631 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-catalog-content\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.475657 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-728w2\" (UniqueName: \"kubernetes.io/projected/64c6c5bb-4951-472c-bc14-1fcddc01593a-kube-api-access-728w2\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.476132 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-catalog-content\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.476250 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-utilities\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.476542 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-bundle" (OuterVolumeSpecName: "bundle") pod "6341e1c6-5dc9-4d69-bd26-904c36a57f1b" (UID: "6341e1c6-5dc9-4d69-bd26-904c36a57f1b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.480852 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-kube-api-access-6cmgr" (OuterVolumeSpecName: "kube-api-access-6cmgr") pod "6341e1c6-5dc9-4d69-bd26-904c36a57f1b" (UID: "6341e1c6-5dc9-4d69-bd26-904c36a57f1b"). InnerVolumeSpecName "kube-api-access-6cmgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.493557 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-util" (OuterVolumeSpecName: "util") pod "6341e1c6-5dc9-4d69-bd26-904c36a57f1b" (UID: "6341e1c6-5dc9-4d69-bd26-904c36a57f1b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.496540 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-728w2\" (UniqueName: \"kubernetes.io/projected/64c6c5bb-4951-472c-bc14-1fcddc01593a-kube-api-access-728w2\") pod \"certified-operators-9lwrs\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.582275 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cmgr\" (UniqueName: \"kubernetes.io/projected/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-kube-api-access-6cmgr\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.582313 4801 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-util\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.582326 4801 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6341e1c6-5dc9-4d69-bd26-904c36a57f1b-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:29 crc kubenswrapper[4801]: I0122 14:18:29.636387 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:30 crc kubenswrapper[4801]: I0122 14:18:30.035815 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" event={"ID":"6341e1c6-5dc9-4d69-bd26-904c36a57f1b","Type":"ContainerDied","Data":"956b0f7ecc5800b9076c4e807285283a3a0cc95d21c4f25647324ca32c61dfd6"} Jan 22 14:18:30 crc kubenswrapper[4801]: I0122 14:18:30.036054 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="956b0f7ecc5800b9076c4e807285283a3a0cc95d21c4f25647324ca32c61dfd6" Jan 22 14:18:30 crc kubenswrapper[4801]: I0122 14:18:30.035882 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc" Jan 22 14:18:30 crc kubenswrapper[4801]: I0122 14:18:30.095167 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9lwrs"] Jan 22 14:18:30 crc kubenswrapper[4801]: W0122 14:18:30.101855 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64c6c5bb_4951_472c_bc14_1fcddc01593a.slice/crio-017989d49a66c5b7544f1aabac8def517a7479fc388cfbf84171b9b6ce58af78 WatchSource:0}: Error finding container 017989d49a66c5b7544f1aabac8def517a7479fc388cfbf84171b9b6ce58af78: Status 404 returned error can't find the container with id 017989d49a66c5b7544f1aabac8def517a7479fc388cfbf84171b9b6ce58af78 Jan 22 14:18:31 crc kubenswrapper[4801]: I0122 14:18:31.043504 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lwrs" event={"ID":"64c6c5bb-4951-472c-bc14-1fcddc01593a","Type":"ContainerStarted","Data":"017989d49a66c5b7544f1aabac8def517a7479fc388cfbf84171b9b6ce58af78"} Jan 22 14:18:32 crc kubenswrapper[4801]: I0122 14:18:32.051635 4801 
generic.go:334] "Generic (PLEG): container finished" podID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerID="1a8bd403518d3e967014d87d8f36db68a98b85fabd22105f0be697949b256b33" exitCode=0 Jan 22 14:18:32 crc kubenswrapper[4801]: I0122 14:18:32.051683 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lwrs" event={"ID":"64c6c5bb-4951-472c-bc14-1fcddc01593a","Type":"ContainerDied","Data":"1a8bd403518d3e967014d87d8f36db68a98b85fabd22105f0be697949b256b33"} Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.057834 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lwrs" event={"ID":"64c6c5bb-4951-472c-bc14-1fcddc01593a","Type":"ContainerStarted","Data":"52cb44a3947b6007086f293f256c623de20ec365b480207eae89aa3f1b3798d9"} Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.588709 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s"] Jan 22 14:18:33 crc kubenswrapper[4801]: E0122 14:18:33.589359 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerName="util" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.589388 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerName="util" Jan 22 14:18:33 crc kubenswrapper[4801]: E0122 14:18:33.589412 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerName="extract" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.589424 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerName="extract" Jan 22 14:18:33 crc kubenswrapper[4801]: E0122 14:18:33.589467 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerName="pull" Jan 22 
14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.589476 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerName="pull" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.589652 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="6341e1c6-5dc9-4d69-bd26-904c36a57f1b" containerName="extract" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.590182 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.594295 4801 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-jwbld" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.594655 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.594846 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.624543 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s"] Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.738072 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjtv\" (UniqueName: \"kubernetes.io/projected/c4e15cce-995c-4bf6-b989-a37657fb62cc-kube-api-access-4hjtv\") pod \"cert-manager-operator-controller-manager-64cf6dff88-cmj5s\" (UID: \"c4e15cce-995c-4bf6-b989-a37657fb62cc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.738126 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4e15cce-995c-4bf6-b989-a37657fb62cc-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-cmj5s\" (UID: \"c4e15cce-995c-4bf6-b989-a37657fb62cc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.839967 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjtv\" (UniqueName: \"kubernetes.io/projected/c4e15cce-995c-4bf6-b989-a37657fb62cc-kube-api-access-4hjtv\") pod \"cert-manager-operator-controller-manager-64cf6dff88-cmj5s\" (UID: \"c4e15cce-995c-4bf6-b989-a37657fb62cc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.840032 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4e15cce-995c-4bf6-b989-a37657fb62cc-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-cmj5s\" (UID: \"c4e15cce-995c-4bf6-b989-a37657fb62cc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.840668 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4e15cce-995c-4bf6-b989-a37657fb62cc-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-cmj5s\" (UID: \"c4e15cce-995c-4bf6-b989-a37657fb62cc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.865466 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjtv\" (UniqueName: \"kubernetes.io/projected/c4e15cce-995c-4bf6-b989-a37657fb62cc-kube-api-access-4hjtv\") pod 
\"cert-manager-operator-controller-manager-64cf6dff88-cmj5s\" (UID: \"c4e15cce-995c-4bf6-b989-a37657fb62cc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:33 crc kubenswrapper[4801]: I0122 14:18:33.905699 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" Jan 22 14:18:34 crc kubenswrapper[4801]: I0122 14:18:34.021154 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:18:34 crc kubenswrapper[4801]: I0122 14:18:34.021219 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:18:34 crc kubenswrapper[4801]: I0122 14:18:34.269552 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s"] Jan 22 14:18:34 crc kubenswrapper[4801]: W0122 14:18:34.273967 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4e15cce_995c_4bf6_b989_a37657fb62cc.slice/crio-e2c7eede35187978e1fb6cbb1dae513b11d147b811cf32c1559015ec55ecacc2 WatchSource:0}: Error finding container e2c7eede35187978e1fb6cbb1dae513b11d147b811cf32c1559015ec55ecacc2: Status 404 returned error can't find the container with id e2c7eede35187978e1fb6cbb1dae513b11d147b811cf32c1559015ec55ecacc2 Jan 22 14:18:35 crc kubenswrapper[4801]: I0122 14:18:35.071831 4801 generic.go:334] 
"Generic (PLEG): container finished" podID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerID="52cb44a3947b6007086f293f256c623de20ec365b480207eae89aa3f1b3798d9" exitCode=0 Jan 22 14:18:35 crc kubenswrapper[4801]: I0122 14:18:35.071875 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lwrs" event={"ID":"64c6c5bb-4951-472c-bc14-1fcddc01593a","Type":"ContainerDied","Data":"52cb44a3947b6007086f293f256c623de20ec365b480207eae89aa3f1b3798d9"} Jan 22 14:18:35 crc kubenswrapper[4801]: I0122 14:18:35.073282 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" event={"ID":"c4e15cce-995c-4bf6-b989-a37657fb62cc","Type":"ContainerStarted","Data":"e2c7eede35187978e1fb6cbb1dae513b11d147b811cf32c1559015ec55ecacc2"} Jan 22 14:18:37 crc kubenswrapper[4801]: I0122 14:18:37.096139 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lwrs" event={"ID":"64c6c5bb-4951-472c-bc14-1fcddc01593a","Type":"ContainerStarted","Data":"756cbeb5eae8e80c5b275233a2dc405c9257ea94a1993f00dcb2066b3fdf2a25"} Jan 22 14:18:37 crc kubenswrapper[4801]: I0122 14:18:37.118788 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9lwrs" podStartSLOduration=3.67897591 podStartE2EDuration="8.118769267s" podCreationTimestamp="2026-01-22 14:18:29 +0000 UTC" firstStartedPulling="2026-01-22 14:18:32.053347928 +0000 UTC m=+860.755248111" lastFinishedPulling="2026-01-22 14:18:36.493141275 +0000 UTC m=+865.195041468" observedRunningTime="2026-01-22 14:18:37.113999321 +0000 UTC m=+865.815899514" watchObservedRunningTime="2026-01-22 14:18:37.118769267 +0000 UTC m=+865.820669450" Jan 22 14:18:37 crc kubenswrapper[4801]: I0122 14:18:37.981909 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dvff6" Jan 22 14:18:39 crc 
kubenswrapper[4801]: I0122 14:18:39.637242 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:39 crc kubenswrapper[4801]: I0122 14:18:39.637623 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:39 crc kubenswrapper[4801]: I0122 14:18:39.698167 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:39 crc kubenswrapper[4801]: I0122 14:18:39.853111 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8lbtv"] Jan 22 14:18:39 crc kubenswrapper[4801]: I0122 14:18:39.855934 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:39 crc kubenswrapper[4801]: I0122 14:18:39.863326 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lbtv"] Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:39.938436 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-utilities\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:39.938516 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-catalog-content\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:39.938559 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4rq\" (UniqueName: \"kubernetes.io/projected/891454c4-2364-44df-8bea-d79ade6c7234-kube-api-access-9k4rq\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:40.039332 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-utilities\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:40.039398 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-catalog-content\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:40.039440 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4rq\" (UniqueName: \"kubernetes.io/projected/891454c4-2364-44df-8bea-d79ade6c7234-kube-api-access-9k4rq\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:40.040372 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-catalog-content\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:40.040564 4801 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-utilities\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:40.172341 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4rq\" (UniqueName: \"kubernetes.io/projected/891454c4-2364-44df-8bea-d79ade6c7234-kube-api-access-9k4rq\") pod \"redhat-marketplace-8lbtv\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:40 crc kubenswrapper[4801]: I0122 14:18:40.179386 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:43 crc kubenswrapper[4801]: I0122 14:18:43.902762 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lbtv"] Jan 22 14:18:43 crc kubenswrapper[4801]: W0122 14:18:43.910037 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod891454c4_2364_44df_8bea_d79ade6c7234.slice/crio-00e412c0111b5d3d30b9590705cdcff46b1cdb6ef8566beb450b2aa34efe2043 WatchSource:0}: Error finding container 00e412c0111b5d3d30b9590705cdcff46b1cdb6ef8566beb450b2aa34efe2043: Status 404 returned error can't find the container with id 00e412c0111b5d3d30b9590705cdcff46b1cdb6ef8566beb450b2aa34efe2043 Jan 22 14:18:44 crc kubenswrapper[4801]: I0122 14:18:44.135975 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" event={"ID":"c4e15cce-995c-4bf6-b989-a37657fb62cc","Type":"ContainerStarted","Data":"ad3f70be776b643e7016f8190ab040660d0a072172b2b410309e13c4ca5b6842"} Jan 22 14:18:44 crc kubenswrapper[4801]: I0122 14:18:44.137432 
4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lbtv" event={"ID":"891454c4-2364-44df-8bea-d79ade6c7234","Type":"ContainerStarted","Data":"00e412c0111b5d3d30b9590705cdcff46b1cdb6ef8566beb450b2aa34efe2043"} Jan 22 14:18:44 crc kubenswrapper[4801]: I0122 14:18:44.154869 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-cmj5s" podStartSLOduration=1.860501323 podStartE2EDuration="11.154843053s" podCreationTimestamp="2026-01-22 14:18:33 +0000 UTC" firstStartedPulling="2026-01-22 14:18:34.276373521 +0000 UTC m=+862.978273704" lastFinishedPulling="2026-01-22 14:18:43.570715251 +0000 UTC m=+872.272615434" observedRunningTime="2026-01-22 14:18:44.154318259 +0000 UTC m=+872.856218442" watchObservedRunningTime="2026-01-22 14:18:44.154843053 +0000 UTC m=+872.856743236" Jan 22 14:18:45 crc kubenswrapper[4801]: I0122 14:18:45.143158 4801 generic.go:334] "Generic (PLEG): container finished" podID="891454c4-2364-44df-8bea-d79ade6c7234" containerID="9e8f1015f8f8b30b51bfc5b6c5b745bf0b5f507dc91b39e211f272985e1afb65" exitCode=0 Jan 22 14:18:45 crc kubenswrapper[4801]: I0122 14:18:45.143259 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lbtv" event={"ID":"891454c4-2364-44df-8bea-d79ade6c7234","Type":"ContainerDied","Data":"9e8f1015f8f8b30b51bfc5b6c5b745bf0b5f507dc91b39e211f272985e1afb65"} Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.151562 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lbtv" event={"ID":"891454c4-2364-44df-8bea-d79ade6c7234","Type":"ContainerStarted","Data":"9f675dffb46d783f98d2d097f48e9e5e52aed0efab07597eeea8a9191a2637a6"} Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.588705 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-5cn2c"] Jan 
22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.589626 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.591563 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.591563 4801 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6p4ts" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.591564 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.602498 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-5cn2c"] Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.641343 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c56e565-c300-436f-8c6a-a8e8a366a124-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-5cn2c\" (UID: \"4c56e565-c300-436f-8c6a-a8e8a366a124\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.641392 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/4c56e565-c300-436f-8c6a-a8e8a366a124-kube-api-access-gqgrw\") pod \"cert-manager-webhook-f4fb5df64-5cn2c\" (UID: \"4c56e565-c300-436f-8c6a-a8e8a366a124\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.742309 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4c56e565-c300-436f-8c6a-a8e8a366a124-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-5cn2c\" (UID: \"4c56e565-c300-436f-8c6a-a8e8a366a124\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.742361 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/4c56e565-c300-436f-8c6a-a8e8a366a124-kube-api-access-gqgrw\") pod \"cert-manager-webhook-f4fb5df64-5cn2c\" (UID: \"4c56e565-c300-436f-8c6a-a8e8a366a124\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.761692 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c56e565-c300-436f-8c6a-a8e8a366a124-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-5cn2c\" (UID: \"4c56e565-c300-436f-8c6a-a8e8a366a124\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.762299 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/4c56e565-c300-436f-8c6a-a8e8a366a124-kube-api-access-gqgrw\") pod \"cert-manager-webhook-f4fb5df64-5cn2c\" (UID: \"4c56e565-c300-436f-8c6a-a8e8a366a124\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:46 crc kubenswrapper[4801]: I0122 14:18:46.904081 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:47 crc kubenswrapper[4801]: I0122 14:18:47.161596 4801 generic.go:334] "Generic (PLEG): container finished" podID="891454c4-2364-44df-8bea-d79ade6c7234" containerID="9f675dffb46d783f98d2d097f48e9e5e52aed0efab07597eeea8a9191a2637a6" exitCode=0 Jan 22 14:18:47 crc kubenswrapper[4801]: I0122 14:18:47.161676 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lbtv" event={"ID":"891454c4-2364-44df-8bea-d79ade6c7234","Type":"ContainerDied","Data":"9f675dffb46d783f98d2d097f48e9e5e52aed0efab07597eeea8a9191a2637a6"} Jan 22 14:18:47 crc kubenswrapper[4801]: I0122 14:18:47.330981 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-5cn2c"] Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.173295 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" event={"ID":"4c56e565-c300-436f-8c6a-a8e8a366a124","Type":"ContainerStarted","Data":"19855442154c10e2853615a69763cc15b92bff8e8f2fa55ec8f4e70c145a5efd"} Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.175261 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lbtv" event={"ID":"891454c4-2364-44df-8bea-d79ade6c7234","Type":"ContainerStarted","Data":"5ebc2209f482e696025ec02bea118b25f5a0a22e78f81742d22b1f2f302d0078"} Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.805863 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8lbtv" podStartSLOduration=7.306474336 podStartE2EDuration="9.805845752s" podCreationTimestamp="2026-01-22 14:18:39 +0000 UTC" firstStartedPulling="2026-01-22 14:18:45.144716088 +0000 UTC m=+873.846616271" lastFinishedPulling="2026-01-22 14:18:47.644087504 +0000 UTC m=+876.345987687" observedRunningTime="2026-01-22 14:18:48.201166357 
+0000 UTC m=+876.903066560" watchObservedRunningTime="2026-01-22 14:18:48.805845752 +0000 UTC m=+877.507745925" Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.806430 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn"] Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.807537 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.809708 4801 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6r26k" Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.812067 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn"] Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.966831 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/405b6c3b-1654-4cf5-a56d-c7670af97153-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-4vmgn\" (UID: \"405b6c3b-1654-4cf5-a56d-c7670af97153\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:48 crc kubenswrapper[4801]: I0122 14:18:48.966935 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcph\" (UniqueName: \"kubernetes.io/projected/405b6c3b-1654-4cf5-a56d-c7670af97153-kube-api-access-xdcph\") pod \"cert-manager-cainjector-855d9ccff4-4vmgn\" (UID: \"405b6c3b-1654-4cf5-a56d-c7670af97153\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:49 crc kubenswrapper[4801]: I0122 14:18:49.070055 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/405b6c3b-1654-4cf5-a56d-c7670af97153-bound-sa-token\") pod 
\"cert-manager-cainjector-855d9ccff4-4vmgn\" (UID: \"405b6c3b-1654-4cf5-a56d-c7670af97153\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:49 crc kubenswrapper[4801]: I0122 14:18:49.070129 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcph\" (UniqueName: \"kubernetes.io/projected/405b6c3b-1654-4cf5-a56d-c7670af97153-kube-api-access-xdcph\") pod \"cert-manager-cainjector-855d9ccff4-4vmgn\" (UID: \"405b6c3b-1654-4cf5-a56d-c7670af97153\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:49 crc kubenswrapper[4801]: I0122 14:18:49.087774 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/405b6c3b-1654-4cf5-a56d-c7670af97153-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-4vmgn\" (UID: \"405b6c3b-1654-4cf5-a56d-c7670af97153\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:49 crc kubenswrapper[4801]: I0122 14:18:49.087930 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcph\" (UniqueName: \"kubernetes.io/projected/405b6c3b-1654-4cf5-a56d-c7670af97153-kube-api-access-xdcph\") pod \"cert-manager-cainjector-855d9ccff4-4vmgn\" (UID: \"405b6c3b-1654-4cf5-a56d-c7670af97153\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:49 crc kubenswrapper[4801]: I0122 14:18:49.132690 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" Jan 22 14:18:49 crc kubenswrapper[4801]: I0122 14:18:49.522707 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn"] Jan 22 14:18:49 crc kubenswrapper[4801]: I0122 14:18:49.685672 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:50 crc kubenswrapper[4801]: I0122 14:18:50.180021 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:50 crc kubenswrapper[4801]: I0122 14:18:50.180069 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:50 crc kubenswrapper[4801]: I0122 14:18:50.188372 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" event={"ID":"405b6c3b-1654-4cf5-a56d-c7670af97153","Type":"ContainerStarted","Data":"6b9671e81e516d7e8263c9a0f9d09a55eef10fef6cfcd8de12f463a2e38081d1"} Jan 22 14:18:50 crc kubenswrapper[4801]: I0122 14:18:50.318995 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:50 crc kubenswrapper[4801]: I0122 14:18:50.849002 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lwrs"] Jan 22 14:18:50 crc kubenswrapper[4801]: I0122 14:18:50.849297 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9lwrs" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="registry-server" containerID="cri-o://756cbeb5eae8e80c5b275233a2dc405c9257ea94a1993f00dcb2066b3fdf2a25" gracePeriod=2 Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.198097 4801 generic.go:334] "Generic (PLEG): 
container finished" podID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerID="756cbeb5eae8e80c5b275233a2dc405c9257ea94a1993f00dcb2066b3fdf2a25" exitCode=0 Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.198910 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lwrs" event={"ID":"64c6c5bb-4951-472c-bc14-1fcddc01593a","Type":"ContainerDied","Data":"756cbeb5eae8e80c5b275233a2dc405c9257ea94a1993f00dcb2066b3fdf2a25"} Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.772676 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.808839 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-728w2\" (UniqueName: \"kubernetes.io/projected/64c6c5bb-4951-472c-bc14-1fcddc01593a-kube-api-access-728w2\") pod \"64c6c5bb-4951-472c-bc14-1fcddc01593a\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.808923 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-utilities\") pod \"64c6c5bb-4951-472c-bc14-1fcddc01593a\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.809040 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-catalog-content\") pod \"64c6c5bb-4951-472c-bc14-1fcddc01593a\" (UID: \"64c6c5bb-4951-472c-bc14-1fcddc01593a\") " Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.815060 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-utilities" (OuterVolumeSpecName: "utilities") pod 
"64c6c5bb-4951-472c-bc14-1fcddc01593a" (UID: "64c6c5bb-4951-472c-bc14-1fcddc01593a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.849274 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c6c5bb-4951-472c-bc14-1fcddc01593a-kube-api-access-728w2" (OuterVolumeSpecName: "kube-api-access-728w2") pod "64c6c5bb-4951-472c-bc14-1fcddc01593a" (UID: "64c6c5bb-4951-472c-bc14-1fcddc01593a"). InnerVolumeSpecName "kube-api-access-728w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.904979 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64c6c5bb-4951-472c-bc14-1fcddc01593a" (UID: "64c6c5bb-4951-472c-bc14-1fcddc01593a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.910315 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.910354 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-728w2\" (UniqueName: \"kubernetes.io/projected/64c6c5bb-4951-472c-bc14-1fcddc01593a-kube-api-access-728w2\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:51 crc kubenswrapper[4801]: I0122 14:18:51.910368 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64c6c5bb-4951-472c-bc14-1fcddc01593a-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:52 crc kubenswrapper[4801]: I0122 14:18:52.210782 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9lwrs" event={"ID":"64c6c5bb-4951-472c-bc14-1fcddc01593a","Type":"ContainerDied","Data":"017989d49a66c5b7544f1aabac8def517a7479fc388cfbf84171b9b6ce58af78"} Jan 22 14:18:52 crc kubenswrapper[4801]: I0122 14:18:52.210843 4801 scope.go:117] "RemoveContainer" containerID="756cbeb5eae8e80c5b275233a2dc405c9257ea94a1993f00dcb2066b3fdf2a25" Jan 22 14:18:52 crc kubenswrapper[4801]: I0122 14:18:52.210797 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9lwrs" Jan 22 14:18:52 crc kubenswrapper[4801]: I0122 14:18:52.246138 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9lwrs"] Jan 22 14:18:52 crc kubenswrapper[4801]: I0122 14:18:52.256528 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9lwrs"] Jan 22 14:18:52 crc kubenswrapper[4801]: I0122 14:18:52.278353 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:53 crc kubenswrapper[4801]: I0122 14:18:53.582525 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" path="/var/lib/kubelet/pods/64c6c5bb-4951-472c-bc14-1fcddc01593a/volumes" Jan 22 14:18:54 crc kubenswrapper[4801]: I0122 14:18:54.656166 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lbtv"] Jan 22 14:18:54 crc kubenswrapper[4801]: I0122 14:18:54.656420 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8lbtv" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="registry-server" containerID="cri-o://5ebc2209f482e696025ec02bea118b25f5a0a22e78f81742d22b1f2f302d0078" gracePeriod=2 Jan 22 14:18:55 crc kubenswrapper[4801]: I0122 14:18:55.234735 4801 generic.go:334] "Generic (PLEG): container finished" podID="891454c4-2364-44df-8bea-d79ade6c7234" containerID="5ebc2209f482e696025ec02bea118b25f5a0a22e78f81742d22b1f2f302d0078" exitCode=0 Jan 22 14:18:55 crc kubenswrapper[4801]: I0122 14:18:55.235376 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lbtv" event={"ID":"891454c4-2364-44df-8bea-d79ade6c7234","Type":"ContainerDied","Data":"5ebc2209f482e696025ec02bea118b25f5a0a22e78f81742d22b1f2f302d0078"} Jan 22 14:18:57 
crc kubenswrapper[4801]: I0122 14:18:57.653181 4801 scope.go:117] "RemoveContainer" containerID="52cb44a3947b6007086f293f256c623de20ec365b480207eae89aa3f1b3798d9" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.724867 4801 scope.go:117] "RemoveContainer" containerID="1a8bd403518d3e967014d87d8f36db68a98b85fabd22105f0be697949b256b33" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.868734 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9872z"] Jan 22 14:18:57 crc kubenswrapper[4801]: E0122 14:18:57.874787 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="registry-server" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.874818 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="registry-server" Jan 22 14:18:57 crc kubenswrapper[4801]: E0122 14:18:57.874832 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="extract-content" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.874838 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="extract-content" Jan 22 14:18:57 crc kubenswrapper[4801]: E0122 14:18:57.874848 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="extract-utilities" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.874854 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="extract-utilities" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.874961 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c6c5bb-4951-472c-bc14-1fcddc01593a" containerName="registry-server" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.875266 4801 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9872z"] Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.875341 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.877769 4801 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-j8w9n" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.920848 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.952988 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frpvm\" (UniqueName: \"kubernetes.io/projected/2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de-kube-api-access-frpvm\") pod \"cert-manager-86cb77c54b-9872z\" (UID: \"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de\") " pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:57 crc kubenswrapper[4801]: I0122 14:18:57.953387 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de-bound-sa-token\") pod \"cert-manager-86cb77c54b-9872z\" (UID: \"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de\") " pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.054724 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-catalog-content\") pod \"891454c4-2364-44df-8bea-d79ade6c7234\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.054883 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-utilities\") pod \"891454c4-2364-44df-8bea-d79ade6c7234\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.054924 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k4rq\" (UniqueName: \"kubernetes.io/projected/891454c4-2364-44df-8bea-d79ade6c7234-kube-api-access-9k4rq\") pod \"891454c4-2364-44df-8bea-d79ade6c7234\" (UID: \"891454c4-2364-44df-8bea-d79ade6c7234\") " Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.055077 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de-bound-sa-token\") pod \"cert-manager-86cb77c54b-9872z\" (UID: \"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de\") " pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.055127 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frpvm\" (UniqueName: \"kubernetes.io/projected/2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de-kube-api-access-frpvm\") pod \"cert-manager-86cb77c54b-9872z\" (UID: \"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de\") " pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.059078 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-utilities" (OuterVolumeSpecName: "utilities") pod "891454c4-2364-44df-8bea-d79ade6c7234" (UID: "891454c4-2364-44df-8bea-d79ade6c7234"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.076494 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "891454c4-2364-44df-8bea-d79ade6c7234" (UID: "891454c4-2364-44df-8bea-d79ade6c7234"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.093842 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frpvm\" (UniqueName: \"kubernetes.io/projected/2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de-kube-api-access-frpvm\") pod \"cert-manager-86cb77c54b-9872z\" (UID: \"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de\") " pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.097271 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de-bound-sa-token\") pod \"cert-manager-86cb77c54b-9872z\" (UID: \"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de\") " pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.130573 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891454c4-2364-44df-8bea-d79ade6c7234-kube-api-access-9k4rq" (OuterVolumeSpecName: "kube-api-access-9k4rq") pod "891454c4-2364-44df-8bea-d79ade6c7234" (UID: "891454c4-2364-44df-8bea-d79ade6c7234"). InnerVolumeSpecName "kube-api-access-9k4rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.156821 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.156869 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k4rq\" (UniqueName: \"kubernetes.io/projected/891454c4-2364-44df-8bea-d79ade6c7234-kube-api-access-9k4rq\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.156880 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891454c4-2364-44df-8bea-d79ade6c7234-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.234339 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-9872z" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.256528 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" event={"ID":"405b6c3b-1654-4cf5-a56d-c7670af97153","Type":"ContainerStarted","Data":"f2d5476824e2640f355b022026c0abfd0d39d7f21c32ca368947310779cb2845"} Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.261184 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" event={"ID":"4c56e565-c300-436f-8c6a-a8e8a366a124","Type":"ContainerStarted","Data":"5aa3aa34405aca08e5e54332133051a6bd658ae439ff8a037c565cdf5d3bdf56"} Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.262051 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.265249 4801 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lbtv" event={"ID":"891454c4-2364-44df-8bea-d79ade6c7234","Type":"ContainerDied","Data":"00e412c0111b5d3d30b9590705cdcff46b1cdb6ef8566beb450b2aa34efe2043"} Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.265286 4801 scope.go:117] "RemoveContainer" containerID="5ebc2209f482e696025ec02bea118b25f5a0a22e78f81742d22b1f2f302d0078" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.265427 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lbtv" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.295270 4801 scope.go:117] "RemoveContainer" containerID="9f675dffb46d783f98d2d097f48e9e5e52aed0efab07597eeea8a9191a2637a6" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.313575 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-4vmgn" podStartSLOduration=2.047441299 podStartE2EDuration="10.313548666s" podCreationTimestamp="2026-01-22 14:18:48 +0000 UTC" firstStartedPulling="2026-01-22 14:18:49.544047022 +0000 UTC m=+878.245947215" lastFinishedPulling="2026-01-22 14:18:57.810154399 +0000 UTC m=+886.512054582" observedRunningTime="2026-01-22 14:18:58.274709632 +0000 UTC m=+886.976609825" watchObservedRunningTime="2026-01-22 14:18:58.313548666 +0000 UTC m=+887.015448849" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.340400 4801 scope.go:117] "RemoveContainer" containerID="9e8f1015f8f8b30b51bfc5b6c5b745bf0b5f507dc91b39e211f272985e1afb65" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.346615 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c" podStartSLOduration=1.902607772 podStartE2EDuration="12.346591016s" podCreationTimestamp="2026-01-22 14:18:46 +0000 UTC" firstStartedPulling="2026-01-22 14:18:47.343608784 +0000 UTC m=+876.045508987" 
lastFinishedPulling="2026-01-22 14:18:57.787592048 +0000 UTC m=+886.489492231" observedRunningTime="2026-01-22 14:18:58.305994502 +0000 UTC m=+887.007894685" watchObservedRunningTime="2026-01-22 14:18:58.346591016 +0000 UTC m=+887.048491219" Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.352820 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lbtv"] Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.356002 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lbtv"] Jan 22 14:18:58 crc kubenswrapper[4801]: I0122 14:18:58.690106 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9872z"] Jan 22 14:18:58 crc kubenswrapper[4801]: W0122 14:18:58.693033 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dbd57ed_5cef_4b6d_a11e_4afaf8b8b9de.slice/crio-a530d6eaec8c1e271a652a1577fea28f28556cd90c010e0dd38867e112618a9a WatchSource:0}: Error finding container a530d6eaec8c1e271a652a1577fea28f28556cd90c010e0dd38867e112618a9a: Status 404 returned error can't find the container with id a530d6eaec8c1e271a652a1577fea28f28556cd90c010e0dd38867e112618a9a Jan 22 14:18:59 crc kubenswrapper[4801]: I0122 14:18:59.272848 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-9872z" event={"ID":"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de","Type":"ContainerStarted","Data":"1102c20234a044622735ac59ee56bc90f749fad8aa00ed7b39fd4f60a9c0c665"} Jan 22 14:18:59 crc kubenswrapper[4801]: I0122 14:18:59.273216 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-9872z" event={"ID":"2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de","Type":"ContainerStarted","Data":"a530d6eaec8c1e271a652a1577fea28f28556cd90c010e0dd38867e112618a9a"} Jan 22 14:18:59 crc kubenswrapper[4801]: I0122 14:18:59.297598 
4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-9872z" podStartSLOduration=2.297575745 podStartE2EDuration="2.297575745s" podCreationTimestamp="2026-01-22 14:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:18:59.294877608 +0000 UTC m=+887.996777821" watchObservedRunningTime="2026-01-22 14:18:59.297575745 +0000 UTC m=+887.999475928" Jan 22 14:18:59 crc kubenswrapper[4801]: I0122 14:18:59.578428 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891454c4-2364-44df-8bea-d79ade6c7234" path="/var/lib/kubelet/pods/891454c4-2364-44df-8bea-d79ade6c7234/volumes" Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.361889 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dkll9"] Jan 22 14:19:01 crc kubenswrapper[4801]: E0122 14:19:01.363580 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="registry-server" Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.363707 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="registry-server" Jan 22 14:19:01 crc kubenswrapper[4801]: E0122 14:19:01.363816 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="extract-utilities" Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.363911 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="extract-utilities" Jan 22 14:19:01 crc kubenswrapper[4801]: E0122 14:19:01.363986 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="extract-content" Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.364053 4801 
state_mem.go:107] "Deleted CPUSet assignment" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="extract-content"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.364271 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="891454c4-2364-44df-8bea-d79ade6c7234" containerName="registry-server"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.365361 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.377810 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dkll9"]
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.416474 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zjsh\" (UniqueName: \"kubernetes.io/projected/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-kube-api-access-6zjsh\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.416575 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-utilities\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.416613 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-catalog-content\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.518189 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-catalog-content\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.518246 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zjsh\" (UniqueName: \"kubernetes.io/projected/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-kube-api-access-6zjsh\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.518310 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-utilities\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.518734 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-utilities\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.518946 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-catalog-content\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.542591 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zjsh\" (UniqueName: \"kubernetes.io/projected/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-kube-api-access-6zjsh\") pod \"community-operators-dkll9\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") " pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.690389 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:01 crc kubenswrapper[4801]: I0122 14:19:01.956958 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dkll9"]
Jan 22 14:19:02 crc kubenswrapper[4801]: I0122 14:19:02.290075 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkll9" event={"ID":"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1","Type":"ContainerStarted","Data":"849899feadf091203a9582a627e67e5c72585336d753f2496395624ccb9ef36a"}
Jan 22 14:19:04 crc kubenswrapper[4801]: I0122 14:19:04.020977 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:19:04 crc kubenswrapper[4801]: I0122 14:19:04.021057 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:19:04 crc kubenswrapper[4801]: I0122 14:19:04.301855 4801 generic.go:334] "Generic (PLEG): container finished" podID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerID="1ad29790abd9a95818ac8477e537068ea4baef8f6e4f830448c281dbaaeeabb2" exitCode=0
Jan 22 14:19:04 crc kubenswrapper[4801]: I0122 14:19:04.302468 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkll9" event={"ID":"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1","Type":"ContainerDied","Data":"1ad29790abd9a95818ac8477e537068ea4baef8f6e4f830448c281dbaaeeabb2"}
Jan 22 14:19:06 crc kubenswrapper[4801]: I0122 14:19:06.907372 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-5cn2c"
Jan 22 14:19:08 crc kubenswrapper[4801]: I0122 14:19:08.325427 4801 generic.go:334] "Generic (PLEG): container finished" podID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerID="fc09efd38b67ed0ae8d85327a67484703d533857359e61486ccedcb4ac8d0ebe" exitCode=0
Jan 22 14:19:08 crc kubenswrapper[4801]: I0122 14:19:08.325583 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkll9" event={"ID":"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1","Type":"ContainerDied","Data":"fc09efd38b67ed0ae8d85327a67484703d533857359e61486ccedcb4ac8d0ebe"}
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.175827 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-24pvd"]
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.177102 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-24pvd"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.179568 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.180366 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2mg55"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.180428 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.238322 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-24pvd"]
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.325429 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc999\" (UniqueName: \"kubernetes.io/projected/0ed10c24-b948-45b3-8ff3-8ab9205ccbbb-kube-api-access-pc999\") pod \"openstack-operator-index-24pvd\" (UID: \"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb\") " pod="openstack-operators/openstack-operator-index-24pvd"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.427009 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc999\" (UniqueName: \"kubernetes.io/projected/0ed10c24-b948-45b3-8ff3-8ab9205ccbbb-kube-api-access-pc999\") pod \"openstack-operator-index-24pvd\" (UID: \"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb\") " pod="openstack-operators/openstack-operator-index-24pvd"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.446178 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc999\" (UniqueName: \"kubernetes.io/projected/0ed10c24-b948-45b3-8ff3-8ab9205ccbbb-kube-api-access-pc999\") pod \"openstack-operator-index-24pvd\" (UID: \"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb\") " pod="openstack-operators/openstack-operator-index-24pvd"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.491031 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-24pvd"
Jan 22 14:19:10 crc kubenswrapper[4801]: I0122 14:19:10.919661 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-24pvd"]
Jan 22 14:19:10 crc kubenswrapper[4801]: W0122 14:19:10.921009 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed10c24_b948_45b3_8ff3_8ab9205ccbbb.slice/crio-bf4ff3f7931f9374b945ab3a7ce09da2996f445e73f52279c728f5a94c6aa011 WatchSource:0}: Error finding container bf4ff3f7931f9374b945ab3a7ce09da2996f445e73f52279c728f5a94c6aa011: Status 404 returned error can't find the container with id bf4ff3f7931f9374b945ab3a7ce09da2996f445e73f52279c728f5a94c6aa011
Jan 22 14:19:11 crc kubenswrapper[4801]: I0122 14:19:11.349409 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-24pvd" event={"ID":"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb","Type":"ContainerStarted","Data":"bf4ff3f7931f9374b945ab3a7ce09da2996f445e73f52279c728f5a94c6aa011"}
Jan 22 14:19:12 crc kubenswrapper[4801]: I0122 14:19:12.359958 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkll9" event={"ID":"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1","Type":"ContainerStarted","Data":"a629df026447392deb94e67c41b07e034bb00ca9b2ea6145531ee58169f0ee46"}
Jan 22 14:19:12 crc kubenswrapper[4801]: I0122 14:19:12.382232 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dkll9" podStartSLOduration=5.713324272 podStartE2EDuration="11.382212401s" podCreationTimestamp="2026-01-22 14:19:01 +0000 UTC" firstStartedPulling="2026-01-22 14:19:04.304115269 +0000 UTC m=+893.006015452" lastFinishedPulling="2026-01-22 14:19:09.973003388 +0000 UTC m=+898.674903581" observedRunningTime="2026-01-22 14:19:12.381244734 +0000 UTC m=+901.083144927" watchObservedRunningTime="2026-01-22 14:19:12.382212401 +0000 UTC m=+901.084112584"
Jan 22 14:19:15 crc kubenswrapper[4801]: I0122 14:19:15.109626 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-24pvd"]
Jan 22 14:19:15 crc kubenswrapper[4801]: I0122 14:19:15.861369 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xrv9d"]
Jan 22 14:19:15 crc kubenswrapper[4801]: I0122 14:19:15.863008 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:15 crc kubenswrapper[4801]: I0122 14:19:15.867044 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xrv9d"]
Jan 22 14:19:15 crc kubenswrapper[4801]: I0122 14:19:15.998179 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47cs6\" (UniqueName: \"kubernetes.io/projected/1f553167-d9c0-4808-b942-348a04597e38-kube-api-access-47cs6\") pod \"openstack-operator-index-xrv9d\" (UID: \"1f553167-d9c0-4808-b942-348a04597e38\") " pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:16 crc kubenswrapper[4801]: I0122 14:19:16.100154 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47cs6\" (UniqueName: \"kubernetes.io/projected/1f553167-d9c0-4808-b942-348a04597e38-kube-api-access-47cs6\") pod \"openstack-operator-index-xrv9d\" (UID: \"1f553167-d9c0-4808-b942-348a04597e38\") " pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:16 crc kubenswrapper[4801]: I0122 14:19:16.119911 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47cs6\" (UniqueName: \"kubernetes.io/projected/1f553167-d9c0-4808-b942-348a04597e38-kube-api-access-47cs6\") pod \"openstack-operator-index-xrv9d\" (UID: \"1f553167-d9c0-4808-b942-348a04597e38\") " pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:16 crc kubenswrapper[4801]: I0122 14:19:16.185418 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:16 crc kubenswrapper[4801]: I0122 14:19:16.939524 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xrv9d"]
Jan 22 14:19:16 crc kubenswrapper[4801]: W0122 14:19:16.947523 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f553167_d9c0_4808_b942_348a04597e38.slice/crio-b7243251ded3419ec43096721ea11f860f4877665e3a37d521348f9345d314d5 WatchSource:0}: Error finding container b7243251ded3419ec43096721ea11f860f4877665e3a37d521348f9345d314d5: Status 404 returned error can't find the container with id b7243251ded3419ec43096721ea11f860f4877665e3a37d521348f9345d314d5
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.392917 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xrv9d" event={"ID":"1f553167-d9c0-4808-b942-348a04597e38","Type":"ContainerStarted","Data":"6f575443d04c8c2bab08ffe94332ce1967599b6ef41b57b0aee9166331d6b105"}
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.392969 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xrv9d" event={"ID":"1f553167-d9c0-4808-b942-348a04597e38","Type":"ContainerStarted","Data":"b7243251ded3419ec43096721ea11f860f4877665e3a37d521348f9345d314d5"}
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.394288 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-24pvd" event={"ID":"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb","Type":"ContainerStarted","Data":"e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44"}
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.394389 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-24pvd" podUID="0ed10c24-b948-45b3-8ff3-8ab9205ccbbb" containerName="registry-server" containerID="cri-o://e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44" gracePeriod=2
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.410279 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xrv9d" podStartSLOduration=2.314880537 podStartE2EDuration="2.410262088s" podCreationTimestamp="2026-01-22 14:19:15 +0000 UTC" firstStartedPulling="2026-01-22 14:19:16.951102827 +0000 UTC m=+905.653003010" lastFinishedPulling="2026-01-22 14:19:17.046484378 +0000 UTC m=+905.748384561" observedRunningTime="2026-01-22 14:19:17.408892589 +0000 UTC m=+906.110792792" watchObservedRunningTime="2026-01-22 14:19:17.410262088 +0000 UTC m=+906.112162271"
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.426649 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-24pvd" podStartSLOduration=1.745636879 podStartE2EDuration="7.426630503s" podCreationTimestamp="2026-01-22 14:19:10 +0000 UTC" firstStartedPulling="2026-01-22 14:19:10.923004228 +0000 UTC m=+899.624904411" lastFinishedPulling="2026-01-22 14:19:16.603997852 +0000 UTC m=+905.305898035" observedRunningTime="2026-01-22 14:19:17.421865367 +0000 UTC m=+906.123765550" watchObservedRunningTime="2026-01-22 14:19:17.426630503 +0000 UTC m=+906.128530696"
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.809330 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-24pvd"
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.930772 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc999\" (UniqueName: \"kubernetes.io/projected/0ed10c24-b948-45b3-8ff3-8ab9205ccbbb-kube-api-access-pc999\") pod \"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb\" (UID: \"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb\") "
Jan 22 14:19:17 crc kubenswrapper[4801]: I0122 14:19:17.936186 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed10c24-b948-45b3-8ff3-8ab9205ccbbb-kube-api-access-pc999" (OuterVolumeSpecName: "kube-api-access-pc999") pod "0ed10c24-b948-45b3-8ff3-8ab9205ccbbb" (UID: "0ed10c24-b948-45b3-8ff3-8ab9205ccbbb"). InnerVolumeSpecName "kube-api-access-pc999". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.032412 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc999\" (UniqueName: \"kubernetes.io/projected/0ed10c24-b948-45b3-8ff3-8ab9205ccbbb-kube-api-access-pc999\") on node \"crc\" DevicePath \"\""
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.401488 4801 generic.go:334] "Generic (PLEG): container finished" podID="0ed10c24-b948-45b3-8ff3-8ab9205ccbbb" containerID="e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44" exitCode=0
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.401595 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-24pvd"
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.401616 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-24pvd" event={"ID":"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb","Type":"ContainerDied","Data":"e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44"}
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.401666 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-24pvd" event={"ID":"0ed10c24-b948-45b3-8ff3-8ab9205ccbbb","Type":"ContainerDied","Data":"bf4ff3f7931f9374b945ab3a7ce09da2996f445e73f52279c728f5a94c6aa011"}
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.401684 4801 scope.go:117] "RemoveContainer" containerID="e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44"
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.417554 4801 scope.go:117] "RemoveContainer" containerID="e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44"
Jan 22 14:19:18 crc kubenswrapper[4801]: E0122 14:19:18.417932 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44\": container with ID starting with e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44 not found: ID does not exist" containerID="e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44"
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.417971 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44"} err="failed to get container status \"e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44\": rpc error: code = NotFound desc = could not find container \"e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44\": container with ID starting with e92b921b2b01cd5d98fd404689c48759565f79579bc8440795b3785985ffef44 not found: ID does not exist"
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.431667 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-24pvd"]
Jan 22 14:19:18 crc kubenswrapper[4801]: I0122 14:19:18.434239 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-24pvd"]
Jan 22 14:19:19 crc kubenswrapper[4801]: I0122 14:19:19.584929 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed10c24-b948-45b3-8ff3-8ab9205ccbbb" path="/var/lib/kubelet/pods/0ed10c24-b948-45b3-8ff3-8ab9205ccbbb/volumes"
Jan 22 14:19:21 crc kubenswrapper[4801]: I0122 14:19:21.690973 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:21 crc kubenswrapper[4801]: I0122 14:19:21.691314 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:21 crc kubenswrapper[4801]: I0122 14:19:21.752570 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:22 crc kubenswrapper[4801]: I0122 14:19:22.466535 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:24 crc kubenswrapper[4801]: I0122 14:19:24.053667 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dkll9"]
Jan 22 14:19:24 crc kubenswrapper[4801]: I0122 14:19:24.438850 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dkll9" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="registry-server" containerID="cri-o://a629df026447392deb94e67c41b07e034bb00ca9b2ea6145531ee58169f0ee46" gracePeriod=2
Jan 22 14:19:25 crc kubenswrapper[4801]: I0122 14:19:25.453733 4801 generic.go:334] "Generic (PLEG): container finished" podID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerID="a629df026447392deb94e67c41b07e034bb00ca9b2ea6145531ee58169f0ee46" exitCode=0
Jan 22 14:19:25 crc kubenswrapper[4801]: I0122 14:19:25.453848 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkll9" event={"ID":"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1","Type":"ContainerDied","Data":"a629df026447392deb94e67c41b07e034bb00ca9b2ea6145531ee58169f0ee46"}
Jan 22 14:19:26 crc kubenswrapper[4801]: I0122 14:19:26.186063 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:26 crc kubenswrapper[4801]: I0122 14:19:26.186124 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:26 crc kubenswrapper[4801]: I0122 14:19:26.214091 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:26 crc kubenswrapper[4801]: I0122 14:19:26.491391 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xrv9d"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.222467 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.303928 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"]
Jan 22 14:19:28 crc kubenswrapper[4801]: E0122 14:19:28.304360 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed10c24-b948-45b3-8ff3-8ab9205ccbbb" containerName="registry-server"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.304377 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed10c24-b948-45b3-8ff3-8ab9205ccbbb" containerName="registry-server"
Jan 22 14:19:28 crc kubenswrapper[4801]: E0122 14:19:28.304387 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="registry-server"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.304411 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="registry-server"
Jan 22 14:19:28 crc kubenswrapper[4801]: E0122 14:19:28.304425 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="extract-utilities"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.304432 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="extract-utilities"
Jan 22 14:19:28 crc kubenswrapper[4801]: E0122 14:19:28.304443 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="extract-content"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.304473 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="extract-content"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.304600 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed10c24-b948-45b3-8ff3-8ab9205ccbbb" containerName="registry-server"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.304635 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" containerName="registry-server"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.305623 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.308101 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kx7f2"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.315484 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"]
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.375606 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zjsh\" (UniqueName: \"kubernetes.io/projected/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-kube-api-access-6zjsh\") pod \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") "
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.375666 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-utilities\") pod \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") "
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.375778 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-catalog-content\") pod \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\" (UID: \"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1\") "
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.377258 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-utilities" (OuterVolumeSpecName: "utilities") pod "0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" (UID: "0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.383757 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-kube-api-access-6zjsh" (OuterVolumeSpecName: "kube-api-access-6zjsh") pod "0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" (UID: "0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1"). InnerVolumeSpecName "kube-api-access-6zjsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.430171 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" (UID: "0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.473637 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkll9" event={"ID":"0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1","Type":"ContainerDied","Data":"849899feadf091203a9582a627e67e5c72585336d753f2496395624ccb9ef36a"}
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.473706 4801 scope.go:117] "RemoveContainer" containerID="a629df026447392deb94e67c41b07e034bb00ca9b2ea6145531ee58169f0ee46"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.474007 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dkll9"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.479217 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-bundle\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.479255 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-util\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.479287 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44xd\" (UniqueName: \"kubernetes.io/projected/9c65ce77-9de5-4b7e-a5ee-668640034790-kube-api-access-w44xd\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.479364 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zjsh\" (UniqueName: \"kubernetes.io/projected/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-kube-api-access-6zjsh\") on node \"crc\" DevicePath \"\""
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.479377 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.479387 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.493356 4801 scope.go:117] "RemoveContainer" containerID="fc09efd38b67ed0ae8d85327a67484703d533857359e61486ccedcb4ac8d0ebe"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.510250 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dkll9"]
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.511725 4801 scope.go:117] "RemoveContainer" containerID="1ad29790abd9a95818ac8477e537068ea4baef8f6e4f830448c281dbaaeeabb2"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.516240 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dkll9"]
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.580561 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-bundle\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.580647 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-util\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.580677 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w44xd\" (UniqueName: \"kubernetes.io/projected/9c65ce77-9de5-4b7e-a5ee-668640034790-kube-api-access-w44xd\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.582125 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-util\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.582151 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-bundle\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.596324 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44xd\" (UniqueName: \"kubernetes.io/projected/9c65ce77-9de5-4b7e-a5ee-668640034790-kube-api-access-w44xd\") pod \"7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:28 crc kubenswrapper[4801]: I0122 14:19:28.622924 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:29 crc kubenswrapper[4801]: I0122 14:19:29.028618 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"]
Jan 22 14:19:29 crc kubenswrapper[4801]: I0122 14:19:29.491917 4801 generic.go:334] "Generic (PLEG): container finished" podID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerID="90284776c47e99aa4715072b8e14e9b3e50abba11baa5b599b688f6aa56f26fe" exitCode=0
Jan 22 14:19:29 crc kubenswrapper[4801]: I0122 14:19:29.492013 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t" event={"ID":"9c65ce77-9de5-4b7e-a5ee-668640034790","Type":"ContainerDied","Data":"90284776c47e99aa4715072b8e14e9b3e50abba11baa5b599b688f6aa56f26fe"}
Jan 22 14:19:29 crc kubenswrapper[4801]: I0122 14:19:29.492050 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t" event={"ID":"9c65ce77-9de5-4b7e-a5ee-668640034790","Type":"ContainerStarted","Data":"bc147d777384b9d71bb722dc57e31dbc51ca318614f2098ab8130b92853a2d36"}
Jan 22 14:19:29 crc kubenswrapper[4801]: I0122 14:19:29.578135 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1" path="/var/lib/kubelet/pods/0c4b5f88-2dbb-410e-a9bb-10c55e6d52c1/volumes"
Jan 22 14:19:30 crc kubenswrapper[4801]: I0122 14:19:30.507764 4801 generic.go:334] "Generic (PLEG): container finished" podID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerID="3b6aa1ffb53fa5ca2a58c012e71a0d827b113de0dbda771edde0cabea9f9559d" exitCode=0
Jan 22 14:19:30 crc kubenswrapper[4801]: I0122 14:19:30.507836 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t" event={"ID":"9c65ce77-9de5-4b7e-a5ee-668640034790","Type":"ContainerDied","Data":"3b6aa1ffb53fa5ca2a58c012e71a0d827b113de0dbda771edde0cabea9f9559d"}
Jan 22 14:19:31 crc kubenswrapper[4801]: I0122 14:19:31.517136 4801 generic.go:334] "Generic (PLEG): container finished" podID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerID="5eb1a8cae4378cb80bbf6ca54e16f777e1d99a4f5fa77b2b1111db78623fdb18" exitCode=0
Jan 22 14:19:31 crc kubenswrapper[4801]: I0122 14:19:31.517232 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t" event={"ID":"9c65ce77-9de5-4b7e-a5ee-668640034790","Type":"ContainerDied","Data":"5eb1a8cae4378cb80bbf6ca54e16f777e1d99a4f5fa77b2b1111db78623fdb18"}
Jan 22 14:19:32 crc kubenswrapper[4801]: I0122 14:19:32.834243 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t"
Jan 22 14:19:32 crc kubenswrapper[4801]: I0122 14:19:32.933957 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w44xd\" (UniqueName: \"kubernetes.io/projected/9c65ce77-9de5-4b7e-a5ee-668640034790-kube-api-access-w44xd\") pod \"9c65ce77-9de5-4b7e-a5ee-668640034790\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") "
Jan 22 14:19:32 crc kubenswrapper[4801]: I0122 14:19:32.934407 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-util\") pod \"9c65ce77-9de5-4b7e-a5ee-668640034790\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") "
Jan 22 14:19:32 crc kubenswrapper[4801]: I0122 14:19:32.934492 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName:
\"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-bundle\") pod \"9c65ce77-9de5-4b7e-a5ee-668640034790\" (UID: \"9c65ce77-9de5-4b7e-a5ee-668640034790\") " Jan 22 14:19:32 crc kubenswrapper[4801]: I0122 14:19:32.935400 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-bundle" (OuterVolumeSpecName: "bundle") pod "9c65ce77-9de5-4b7e-a5ee-668640034790" (UID: "9c65ce77-9de5-4b7e-a5ee-668640034790"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:19:32 crc kubenswrapper[4801]: I0122 14:19:32.942413 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c65ce77-9de5-4b7e-a5ee-668640034790-kube-api-access-w44xd" (OuterVolumeSpecName: "kube-api-access-w44xd") pod "9c65ce77-9de5-4b7e-a5ee-668640034790" (UID: "9c65ce77-9de5-4b7e-a5ee-668640034790"). InnerVolumeSpecName "kube-api-access-w44xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:19:32 crc kubenswrapper[4801]: I0122 14:19:32.951261 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-util" (OuterVolumeSpecName: "util") pod "9c65ce77-9de5-4b7e-a5ee-668640034790" (UID: "9c65ce77-9de5-4b7e-a5ee-668640034790"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:19:33 crc kubenswrapper[4801]: I0122 14:19:33.036354 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w44xd\" (UniqueName: \"kubernetes.io/projected/9c65ce77-9de5-4b7e-a5ee-668640034790-kube-api-access-w44xd\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:33 crc kubenswrapper[4801]: I0122 14:19:33.036792 4801 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-util\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:33 crc kubenswrapper[4801]: I0122 14:19:33.036890 4801 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c65ce77-9de5-4b7e-a5ee-668640034790-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 14:19:33 crc kubenswrapper[4801]: I0122 14:19:33.531132 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t" event={"ID":"9c65ce77-9de5-4b7e-a5ee-668640034790","Type":"ContainerDied","Data":"bc147d777384b9d71bb722dc57e31dbc51ca318614f2098ab8130b92853a2d36"} Jan 22 14:19:33 crc kubenswrapper[4801]: I0122 14:19:33.531188 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc147d777384b9d71bb722dc57e31dbc51ca318614f2098ab8130b92853a2d36" Jan 22 14:19:33 crc kubenswrapper[4801]: I0122 14:19:33.531258 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t" Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.020938 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.021601 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.021664 4801 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.022258 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61d4928d4fe513fa3ba1966842778daa9adc983474924e63b516aff418f3aa6f"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.022309 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://61d4928d4fe513fa3ba1966842778daa9adc983474924e63b516aff418f3aa6f" gracePeriod=600 Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.540137 4801 generic.go:334] "Generic (PLEG): 
container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="61d4928d4fe513fa3ba1966842778daa9adc983474924e63b516aff418f3aa6f" exitCode=0 Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.540198 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"61d4928d4fe513fa3ba1966842778daa9adc983474924e63b516aff418f3aa6f"} Jan 22 14:19:34 crc kubenswrapper[4801]: I0122 14:19:34.540235 4801 scope.go:117] "RemoveContainer" containerID="60a49b0ab9a653bb3640000258ad0704ac97f382b39d4afd913a38b2e74008da" Jan 22 14:19:35 crc kubenswrapper[4801]: I0122 14:19:35.550435 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"749e24075f70012784f8fbfbfdda757c7090725e8db804765c8486f57c8e62bc"} Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.320267 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t"] Jan 22 14:19:37 crc kubenswrapper[4801]: E0122 14:19:37.321113 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerName="pull" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.321131 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerName="pull" Jan 22 14:19:37 crc kubenswrapper[4801]: E0122 14:19:37.321149 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerName="extract" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.321158 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerName="extract" Jan 22 14:19:37 crc kubenswrapper[4801]: 
E0122 14:19:37.321166 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerName="util" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.321173 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerName="util" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.321297 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c65ce77-9de5-4b7e-a5ee-668640034790" containerName="extract" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.321849 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.325908 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-zwh86" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.382299 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t"] Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.500844 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zc2c\" (UniqueName: \"kubernetes.io/projected/0fa19eb8-dac2-48b6-b5f3-71ce817b8d33-kube-api-access-5zc2c\") pod \"openstack-operator-controller-init-9ff957fff-prp2t\" (UID: \"0fa19eb8-dac2-48b6-b5f3-71ce817b8d33\") " pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.602274 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zc2c\" (UniqueName: \"kubernetes.io/projected/0fa19eb8-dac2-48b6-b5f3-71ce817b8d33-kube-api-access-5zc2c\") pod \"openstack-operator-controller-init-9ff957fff-prp2t\" (UID: \"0fa19eb8-dac2-48b6-b5f3-71ce817b8d33\") 
" pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.621441 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zc2c\" (UniqueName: \"kubernetes.io/projected/0fa19eb8-dac2-48b6-b5f3-71ce817b8d33-kube-api-access-5zc2c\") pod \"openstack-operator-controller-init-9ff957fff-prp2t\" (UID: \"0fa19eb8-dac2-48b6-b5f3-71ce817b8d33\") " pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" Jan 22 14:19:37 crc kubenswrapper[4801]: I0122 14:19:37.640108 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" Jan 22 14:19:38 crc kubenswrapper[4801]: I0122 14:19:38.076224 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t"] Jan 22 14:19:38 crc kubenswrapper[4801]: W0122 14:19:38.096288 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fa19eb8_dac2_48b6_b5f3_71ce817b8d33.slice/crio-63effd3801686017d79ec833699b8aedf480afd172e920a21540c318b4846c94 WatchSource:0}: Error finding container 63effd3801686017d79ec833699b8aedf480afd172e920a21540c318b4846c94: Status 404 returned error can't find the container with id 63effd3801686017d79ec833699b8aedf480afd172e920a21540c318b4846c94 Jan 22 14:19:38 crc kubenswrapper[4801]: I0122 14:19:38.571223 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" event={"ID":"0fa19eb8-dac2-48b6-b5f3-71ce817b8d33","Type":"ContainerStarted","Data":"63effd3801686017d79ec833699b8aedf480afd172e920a21540c318b4846c94"} Jan 22 14:19:43 crc kubenswrapper[4801]: I0122 14:19:43.612855 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" event={"ID":"0fa19eb8-dac2-48b6-b5f3-71ce817b8d33","Type":"ContainerStarted","Data":"7705c499e007701ec99dbd74dd2bea34fe7d5593489cc3754af5855e5b36a72b"} Jan 22 14:19:43 crc kubenswrapper[4801]: I0122 14:19:43.613409 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" Jan 22 14:19:43 crc kubenswrapper[4801]: I0122 14:19:43.643561 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" podStartSLOduration=1.842214204 podStartE2EDuration="6.643540348s" podCreationTimestamp="2026-01-22 14:19:37 +0000 UTC" firstStartedPulling="2026-01-22 14:19:38.098814162 +0000 UTC m=+926.800714335" lastFinishedPulling="2026-01-22 14:19:42.900140296 +0000 UTC m=+931.602040479" observedRunningTime="2026-01-22 14:19:43.637476755 +0000 UTC m=+932.339376948" watchObservedRunningTime="2026-01-22 14:19:43.643540348 +0000 UTC m=+932.345440531" Jan 22 14:19:57 crc kubenswrapper[4801]: I0122 14:19:57.642470 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-9ff957fff-prp2t" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.683037 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.686544 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.689543 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7nmvp" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.690521 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.691405 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.692740 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-pc2ng" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.705986 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.713990 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.732231 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.733110 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.737467 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6p7mp" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.749287 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.750433 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.754923 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hfg8m" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.758258 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.778922 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.784317 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.785707 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.805344 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-62csz" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.805880 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.821909 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.825773 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fn7m5" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.839529 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7z86\" (UniqueName: \"kubernetes.io/projected/3d1aad44-ff2f-4309-9984-7da6200820ac-kube-api-access-x7z86\") pod \"designate-operator-controller-manager-b45d7bf98-z9z4g\" (UID: \"3d1aad44-ff2f-4309-9984-7da6200820ac\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.839652 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbldr\" (UniqueName: \"kubernetes.io/projected/f9e65da4-cf22-44e4-8e0e-f33aee5413ba-kube-api-access-lbldr\") pod \"cinder-operator-controller-manager-69cf5d4557-jjdzm\" (UID: \"f9e65da4-cf22-44e4-8e0e-f33aee5413ba\") " pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.839693 4801 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jchx\" (UniqueName: \"kubernetes.io/projected/62197d9f-468d-4a38-8b24-b4639ec0b1c0-kube-api-access-7jchx\") pod \"barbican-operator-controller-manager-59dd8b7cbf-mx4m5\" (UID: \"62197d9f-468d-4a38-8b24-b4639ec0b1c0\") " pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.888387 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.889463 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.891175 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.891579 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nsl2f" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.909371 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.924728 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.925536 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.928939 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-ppdrp" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.937186 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.938221 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.940591 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2kln2" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.941889 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7z86\" (UniqueName: \"kubernetes.io/projected/3d1aad44-ff2f-4309-9984-7da6200820ac-kube-api-access-x7z86\") pod \"designate-operator-controller-manager-b45d7bf98-z9z4g\" (UID: \"3d1aad44-ff2f-4309-9984-7da6200820ac\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.941960 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbldr\" (UniqueName: \"kubernetes.io/projected/f9e65da4-cf22-44e4-8e0e-f33aee5413ba-kube-api-access-lbldr\") pod \"cinder-operator-controller-manager-69cf5d4557-jjdzm\" (UID: \"f9e65da4-cf22-44e4-8e0e-f33aee5413ba\") " pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.941996 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gc4ct\" (UniqueName: \"kubernetes.io/projected/e05677d8-9dbb-489c-9eb8-84ff6981776c-kube-api-access-gc4ct\") pod \"horizon-operator-controller-manager-77d5c5b54f-f2wt7\" (UID: \"e05677d8-9dbb-489c-9eb8-84ff6981776c\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.942022 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdv5\" (UniqueName: \"kubernetes.io/projected/31afca8d-2c0d-419c-ba85-de05d929490b-kube-api-access-bgdv5\") pod \"glance-operator-controller-manager-78fdd796fd-9lj8x\" (UID: \"31afca8d-2c0d-419c-ba85-de05d929490b\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.942049 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxmg9\" (UniqueName: \"kubernetes.io/projected/e1799966-64ec-4e45-b7a5-45715376926c-kube-api-access-fxmg9\") pod \"heat-operator-controller-manager-594c8c9d5d-g8z77\" (UID: \"e1799966-64ec-4e45-b7a5-45715376926c\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.942079 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jchx\" (UniqueName: \"kubernetes.io/projected/62197d9f-468d-4a38-8b24-b4639ec0b1c0-kube-api-access-7jchx\") pod \"barbican-operator-controller-manager-59dd8b7cbf-mx4m5\" (UID: \"62197d9f-468d-4a38-8b24-b4639ec0b1c0\") " pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.955337 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 
14:20:20.965501 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.969073 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.974720 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.984299 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.984980 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj"] Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.985467 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.985775 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.988823 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-7c8tg" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.988982 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8x824" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.989259 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbldr\" (UniqueName: \"kubernetes.io/projected/f9e65da4-cf22-44e4-8e0e-f33aee5413ba-kube-api-access-lbldr\") pod \"cinder-operator-controller-manager-69cf5d4557-jjdzm\" (UID: \"f9e65da4-cf22-44e4-8e0e-f33aee5413ba\") " pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.990735 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jchx\" (UniqueName: \"kubernetes.io/projected/62197d9f-468d-4a38-8b24-b4639ec0b1c0-kube-api-access-7jchx\") pod \"barbican-operator-controller-manager-59dd8b7cbf-mx4m5\" (UID: \"62197d9f-468d-4a38-8b24-b4639ec0b1c0\") " pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" Jan 22 14:20:20 crc kubenswrapper[4801]: I0122 14:20:20.992115 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7z86\" (UniqueName: \"kubernetes.io/projected/3d1aad44-ff2f-4309-9984-7da6200820ac-kube-api-access-x7z86\") pod \"designate-operator-controller-manager-b45d7bf98-z9z4g\" (UID: \"3d1aad44-ff2f-4309-9984-7da6200820ac\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.000512 4801 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.001580 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.005775 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tdtz4" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.014784 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.025294 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.029782 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.047680 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ng66\" (UniqueName: \"kubernetes.io/projected/046a1aec-268f-4b7e-9644-572026853eaa-kube-api-access-5ng66\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.047754 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc4ct\" (UniqueName: \"kubernetes.io/projected/e05677d8-9dbb-489c-9eb8-84ff6981776c-kube-api-access-gc4ct\") pod \"horizon-operator-controller-manager-77d5c5b54f-f2wt7\" (UID: \"e05677d8-9dbb-489c-9eb8-84ff6981776c\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.047773 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdv5\" (UniqueName: \"kubernetes.io/projected/31afca8d-2c0d-419c-ba85-de05d929490b-kube-api-access-bgdv5\") pod \"glance-operator-controller-manager-78fdd796fd-9lj8x\" (UID: \"31afca8d-2c0d-419c-ba85-de05d929490b\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.047791 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxmg9\" (UniqueName: \"kubernetes.io/projected/e1799966-64ec-4e45-b7a5-45715376926c-kube-api-access-fxmg9\") pod \"heat-operator-controller-manager-594c8c9d5d-g8z77\" (UID: \"e1799966-64ec-4e45-b7a5-45715376926c\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" Jan 22 14:20:21 crc kubenswrapper[4801]: 
I0122 14:20:21.047825 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.047864 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgrtg\" (UniqueName: \"kubernetes.io/projected/8bd2b2a1-52ee-41f6-ada0-652b74c4542b-kube-api-access-jgrtg\") pod \"ironic-operator-controller-manager-69d6c9f5b8-v4msq\" (UID: \"8bd2b2a1-52ee-41f6-ada0-652b74c4542b\") " pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.047913 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jsnm\" (UniqueName: \"kubernetes.io/projected/bec5ba53-958e-48be-af46-cf073df0d161-kube-api-access-8jsnm\") pod \"keystone-operator-controller-manager-b8b6d4659-dzf5f\" (UID: \"bec5ba53-958e-48be-af46-cf073df0d161\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.057756 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.069388 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.093641 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.094719 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.095317 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.100060 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.102865 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.110155 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c6zzg" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.124912 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.127786 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jjkf9" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.128123 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxmg9\" (UniqueName: \"kubernetes.io/projected/e1799966-64ec-4e45-b7a5-45715376926c-kube-api-access-fxmg9\") pod \"heat-operator-controller-manager-594c8c9d5d-g8z77\" (UID: \"e1799966-64ec-4e45-b7a5-45715376926c\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.129308 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc4ct\" (UniqueName: \"kubernetes.io/projected/e05677d8-9dbb-489c-9eb8-84ff6981776c-kube-api-access-gc4ct\") pod \"horizon-operator-controller-manager-77d5c5b54f-f2wt7\" (UID: \"e05677d8-9dbb-489c-9eb8-84ff6981776c\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.157586 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.159870 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ng66\" (UniqueName: 
\"kubernetes.io/projected/046a1aec-268f-4b7e-9644-572026853eaa-kube-api-access-5ng66\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.159964 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqtdd\" (UniqueName: \"kubernetes.io/projected/d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43-kube-api-access-rqtdd\") pod \"manila-operator-controller-manager-78c6999f6f-rlqgc\" (UID: \"d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.160000 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.160032 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgrtg\" (UniqueName: \"kubernetes.io/projected/8bd2b2a1-52ee-41f6-ada0-652b74c4542b-kube-api-access-jgrtg\") pod \"ironic-operator-controller-manager-69d6c9f5b8-v4msq\" (UID: \"8bd2b2a1-52ee-41f6-ada0-652b74c4542b\") " pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.160066 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmtsn\" (UniqueName: \"kubernetes.io/projected/def6ced9-f64a-428e-ae30-6b1f97648c5f-kube-api-access-bmtsn\") pod \"neutron-operator-controller-manager-5d8f59fb49-cthv6\" (UID: 
\"def6ced9-f64a-428e-ae30-6b1f97648c5f\") " pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.160101 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lb6\" (UniqueName: \"kubernetes.io/projected/1fed9af5-123f-4db7-84d2-192d22c9024f-kube-api-access-48lb6\") pod \"mariadb-operator-controller-manager-c87fff755-fphcj\" (UID: \"1fed9af5-123f-4db7-84d2-192d22c9024f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.160126 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jsnm\" (UniqueName: \"kubernetes.io/projected/bec5ba53-958e-48be-af46-cf073df0d161-kube-api-access-8jsnm\") pod \"keystone-operator-controller-manager-b8b6d4659-dzf5f\" (UID: \"bec5ba53-958e-48be-af46-cf073df0d161\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" Jan 22 14:20:21 crc kubenswrapper[4801]: E0122 14:20:21.160285 4801 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:21 crc kubenswrapper[4801]: E0122 14:20:21.160371 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert podName:046a1aec-268f-4b7e-9644-572026853eaa nodeName:}" failed. No retries permitted until 2026-01-22 14:20:21.660339603 +0000 UTC m=+970.362239786 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert") pod "infra-operator-controller-manager-54ccf4f85d-ffdkv" (UID: "046a1aec-268f-4b7e-9644-572026853eaa") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.177659 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.191920 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.203162 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdv5\" (UniqueName: \"kubernetes.io/projected/31afca8d-2c0d-419c-ba85-de05d929490b-kube-api-access-bgdv5\") pod \"glance-operator-controller-manager-78fdd796fd-9lj8x\" (UID: \"31afca8d-2c0d-419c-ba85-de05d929490b\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.203702 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgrtg\" (UniqueName: \"kubernetes.io/projected/8bd2b2a1-52ee-41f6-ada0-652b74c4542b-kube-api-access-jgrtg\") pod \"ironic-operator-controller-manager-69d6c9f5b8-v4msq\" (UID: \"8bd2b2a1-52ee-41f6-ada0-652b74c4542b\") " pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.216284 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jsnm\" (UniqueName: \"kubernetes.io/projected/bec5ba53-958e-48be-af46-cf073df0d161-kube-api-access-8jsnm\") pod \"keystone-operator-controller-manager-b8b6d4659-dzf5f\" (UID: \"bec5ba53-958e-48be-af46-cf073df0d161\") " 
pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.217723 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.218660 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.252925 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.253209 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-vcrwl" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.253547 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.254285 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.259113 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ng66\" (UniqueName: \"kubernetes.io/projected/046a1aec-268f-4b7e-9644-572026853eaa-kube-api-access-5ng66\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.331975 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-rntmn" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.332780 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.333571 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqtdd\" (UniqueName: \"kubernetes.io/projected/d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43-kube-api-access-rqtdd\") pod \"manila-operator-controller-manager-78c6999f6f-rlqgc\" (UID: \"d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.333617 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmtsn\" (UniqueName: \"kubernetes.io/projected/def6ced9-f64a-428e-ae30-6b1f97648c5f-kube-api-access-bmtsn\") pod \"neutron-operator-controller-manager-5d8f59fb49-cthv6\" (UID: \"def6ced9-f64a-428e-ae30-6b1f97648c5f\") " pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.333643 4801 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-48lb6\" (UniqueName: \"kubernetes.io/projected/1fed9af5-123f-4db7-84d2-192d22c9024f-kube-api-access-48lb6\") pod \"mariadb-operator-controller-manager-c87fff755-fphcj\" (UID: \"1fed9af5-123f-4db7-84d2-192d22c9024f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.333674 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqgdq\" (UniqueName: \"kubernetes.io/projected/c5890193-a2af-432d-b66b-62f480c03768-kube-api-access-rqgdq\") pod \"nova-operator-controller-manager-6b8bc8d87d-ss9zg\" (UID: \"c5890193-a2af-432d-b66b-62f480c03768\") " pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.333698 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2gqh\" (UniqueName: \"kubernetes.io/projected/dd6ed712-5542-4847-af9b-a536b29000b2-kube-api-access-j2gqh\") pod \"octavia-operator-controller-manager-7bd9774b6-d4lmf\" (UID: \"dd6ed712-5542-4847-af9b-a536b29000b2\") " pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.334526 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.360505 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.370692 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.376020 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lb6\" (UniqueName: \"kubernetes.io/projected/1fed9af5-123f-4db7-84d2-192d22c9024f-kube-api-access-48lb6\") pod \"mariadb-operator-controller-manager-c87fff755-fphcj\" (UID: \"1fed9af5-123f-4db7-84d2-192d22c9024f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.376428 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.388699 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqtdd\" (UniqueName: \"kubernetes.io/projected/d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43-kube-api-access-rqtdd\") pod \"manila-operator-controller-manager-78c6999f6f-rlqgc\" (UID: \"d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.388769 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.389592 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.394009 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmtsn\" (UniqueName: \"kubernetes.io/projected/def6ced9-f64a-428e-ae30-6b1f97648c5f-kube-api-access-bmtsn\") pod \"neutron-operator-controller-manager-5d8f59fb49-cthv6\" (UID: \"def6ced9-f64a-428e-ae30-6b1f97648c5f\") " pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.402912 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mj55b" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.413558 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.422802 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.435283 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.436943 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97l5f\" (UniqueName: \"kubernetes.io/projected/da0d6842-c349-4dfb-8b0d-777cabdc8941-kube-api-access-97l5f\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.437008 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkljh\" (UniqueName: \"kubernetes.io/projected/552434b1-06fc-4f26-972f-e53d29f623e9-kube-api-access-pkljh\") pod \"ovn-operator-controller-manager-55db956ddc-xlsls\" (UID: \"552434b1-06fc-4f26-972f-e53d29f623e9\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.437031 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.437089 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqgdq\" (UniqueName: \"kubernetes.io/projected/c5890193-a2af-432d-b66b-62f480c03768-kube-api-access-rqgdq\") pod \"nova-operator-controller-manager-6b8bc8d87d-ss9zg\" (UID: \"c5890193-a2af-432d-b66b-62f480c03768\") " pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" 
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.437116 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2gqh\" (UniqueName: \"kubernetes.io/projected/dd6ed712-5542-4847-af9b-a536b29000b2-kube-api-access-j2gqh\") pod \"octavia-operator-controller-manager-7bd9774b6-d4lmf\" (UID: \"dd6ed712-5542-4847-af9b-a536b29000b2\") " pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.439372 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.447735 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-r2ttv" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.459912 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.493232 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.505310 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqgdq\" (UniqueName: \"kubernetes.io/projected/c5890193-a2af-432d-b66b-62f480c03768-kube-api-access-rqgdq\") pod \"nova-operator-controller-manager-6b8bc8d87d-ss9zg\" (UID: \"c5890193-a2af-432d-b66b-62f480c03768\") " pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.518008 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2gqh\" (UniqueName: \"kubernetes.io/projected/dd6ed712-5542-4847-af9b-a536b29000b2-kube-api-access-j2gqh\") pod \"octavia-operator-controller-manager-7bd9774b6-d4lmf\" (UID: \"dd6ed712-5542-4847-af9b-a536b29000b2\") " pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.535629 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.549717 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.552374 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vdj\" (UniqueName: \"kubernetes.io/projected/7dd65530-964d-4b59-a201-023f968c59f8-kube-api-access-w7vdj\") pod \"swift-operator-controller-manager-547cbdb99f-t8r9r\" (UID: \"7dd65530-964d-4b59-a201-023f968c59f8\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.552439 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97l5f\" (UniqueName: \"kubernetes.io/projected/da0d6842-c349-4dfb-8b0d-777cabdc8941-kube-api-access-97l5f\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.552498 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwsb\" (UniqueName: \"kubernetes.io/projected/697b4f23-f1b2-4c36-b453-38bff993b462-kube-api-access-5dwsb\") pod \"placement-operator-controller-manager-5d646b7d76-tw2h5\" (UID: \"697b4f23-f1b2-4c36-b453-38bff993b462\") " pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.552528 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkljh\" (UniqueName: \"kubernetes.io/projected/552434b1-06fc-4f26-972f-e53d29f623e9-kube-api-access-pkljh\") pod \"ovn-operator-controller-manager-55db956ddc-xlsls\" (UID: \"552434b1-06fc-4f26-972f-e53d29f623e9\") " 
pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.552548 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:21 crc kubenswrapper[4801]: E0122 14:20:21.552680 4801 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:21 crc kubenswrapper[4801]: E0122 14:20:21.552735 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert podName:da0d6842-c349-4dfb-8b0d-777cabdc8941 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:22.052714912 +0000 UTC m=+970.754615095 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" (UID: "da0d6842-c349-4dfb-8b0d-777cabdc8941") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.560007 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-sn6tr" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.623208 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkljh\" (UniqueName: \"kubernetes.io/projected/552434b1-06fc-4f26-972f-e53d29f623e9-kube-api-access-pkljh\") pod \"ovn-operator-controller-manager-55db956ddc-xlsls\" (UID: \"552434b1-06fc-4f26-972f-e53d29f623e9\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.636674 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx"] Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.641437 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97l5f\" (UniqueName: \"kubernetes.io/projected/da0d6842-c349-4dfb-8b0d-777cabdc8941-kube-api-access-97l5f\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.662529 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.665592 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.665634 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h265\" (UniqueName: \"kubernetes.io/projected/bd9524a3-7561-4e18-84f7-44bd95a5475c-kube-api-access-6h265\") pod \"test-operator-controller-manager-69797bbcbd-tvbhx\" (UID: \"bd9524a3-7561-4e18-84f7-44bd95a5475c\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.665668 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dwsb\" (UniqueName: \"kubernetes.io/projected/697b4f23-f1b2-4c36-b453-38bff993b462-kube-api-access-5dwsb\") pod \"placement-operator-controller-manager-5d646b7d76-tw2h5\" (UID: \"697b4f23-f1b2-4c36-b453-38bff993b462\") " pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.665749 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgnxl\" (UniqueName: \"kubernetes.io/projected/5f1bd071-a488-4023-a2fd-7e8a8b0c998a-kube-api-access-dgnxl\") pod \"telemetry-operator-controller-manager-85cd9769bb-h6dmw\" (UID: \"5f1bd071-a488-4023-a2fd-7e8a8b0c998a\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw" Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 
14:20:21.665829 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vdj\" (UniqueName: \"kubernetes.io/projected/7dd65530-964d-4b59-a201-023f968c59f8-kube-api-access-w7vdj\") pod \"swift-operator-controller-manager-547cbdb99f-t8r9r\" (UID: \"7dd65530-964d-4b59-a201-023f968c59f8\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"
Jan 22 14:20:21 crc kubenswrapper[4801]: E0122 14:20:21.666232 4801 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 22 14:20:21 crc kubenswrapper[4801]: E0122 14:20:21.666318 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert podName:046a1aec-268f-4b7e-9644-572026853eaa nodeName:}" failed. No retries permitted until 2026-01-22 14:20:22.666292125 +0000 UTC m=+971.368192378 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert") pod "infra-operator-controller-manager-54ccf4f85d-ffdkv" (UID: "046a1aec-268f-4b7e-9644-572026853eaa") : secret "infra-operator-webhook-server-cert" not found
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.672933 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.681129 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ctzhw"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.688540 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.701201 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vdj\" (UniqueName: \"kubernetes.io/projected/7dd65530-964d-4b59-a201-023f968c59f8-kube-api-access-w7vdj\") pod \"swift-operator-controller-manager-547cbdb99f-t8r9r\" (UID: \"7dd65530-964d-4b59-a201-023f968c59f8\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.718016 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.727294 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.742919 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dwsb\" (UniqueName: \"kubernetes.io/projected/697b4f23-f1b2-4c36-b453-38bff993b462-kube-api-access-5dwsb\") pod \"placement-operator-controller-manager-5d646b7d76-tw2h5\" (UID: \"697b4f23-f1b2-4c36-b453-38bff993b462\") " pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.749691 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.767347 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h265\" (UniqueName: \"kubernetes.io/projected/bd9524a3-7561-4e18-84f7-44bd95a5475c-kube-api-access-6h265\") pod \"test-operator-controller-manager-69797bbcbd-tvbhx\" (UID: \"bd9524a3-7561-4e18-84f7-44bd95a5475c\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.767438 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgnxl\" (UniqueName: \"kubernetes.io/projected/5f1bd071-a488-4023-a2fd-7e8a8b0c998a-kube-api-access-dgnxl\") pod \"telemetry-operator-controller-manager-85cd9769bb-h6dmw\" (UID: \"5f1bd071-a488-4023-a2fd-7e8a8b0c998a\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.777072 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.786539 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgnxl\" (UniqueName: \"kubernetes.io/projected/5f1bd071-a488-4023-a2fd-7e8a8b0c998a-kube-api-access-dgnxl\") pod \"telemetry-operator-controller-manager-85cd9769bb-h6dmw\" (UID: \"5f1bd071-a488-4023-a2fd-7e8a8b0c998a\") " pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.813800 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h265\" (UniqueName: \"kubernetes.io/projected/bd9524a3-7561-4e18-84f7-44bd95a5475c-kube-api-access-6h265\") pod \"test-operator-controller-manager-69797bbcbd-tvbhx\" (UID: \"bd9524a3-7561-4e18-84f7-44bd95a5475c\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.819685 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.819799 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.828762 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vxxzr"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.864200 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.875490 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.876567 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.879871 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-cbkjd"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.880025 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.881063 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.885213 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsp5b\" (UniqueName: \"kubernetes.io/projected/e3932e04-5349-4181-8019-f651145fa996-kube-api-access-wsp5b\") pod \"watcher-operator-controller-manager-5ffb9c6597-zmpxx\" (UID: \"e3932e04-5349-4181-8019-f651145fa996\") " pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.912881 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.912981 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.914263 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.924038 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-29ngc"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.931571 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"]
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.964541 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.986065 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4qvt\" (UniqueName: \"kubernetes.io/projected/cb47af42-6498-44e4-8a55-04e5fff23296-kube-api-access-t4qvt\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.986106 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7v5l\" (UniqueName: \"kubernetes.io/projected/00d655ad-5646-42b8-baf8-f02ee28ab4ac-kube-api-access-p7v5l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-x7wxr\" (UID: \"00d655ad-5646-42b8-baf8-f02ee28ab4ac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.986127 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.986154 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsp5b\" (UniqueName: \"kubernetes.io/projected/e3932e04-5349-4181-8019-f651145fa996-kube-api-access-wsp5b\") pod \"watcher-operator-controller-manager-5ffb9c6597-zmpxx\" (UID: \"e3932e04-5349-4181-8019-f651145fa996\") " pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"
Jan 22 14:20:21 crc kubenswrapper[4801]: I0122 14:20:21.986211 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.000768 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsp5b\" (UniqueName: \"kubernetes.io/projected/e3932e04-5349-4181-8019-f651145fa996-kube-api-access-wsp5b\") pod \"watcher-operator-controller-manager-5ffb9c6597-zmpxx\" (UID: \"e3932e04-5349-4181-8019-f651145fa996\") " pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.007050 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.029410 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.049960 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.088158 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4qvt\" (UniqueName: \"kubernetes.io/projected/cb47af42-6498-44e4-8a55-04e5fff23296-kube-api-access-t4qvt\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.088210 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7v5l\" (UniqueName: \"kubernetes.io/projected/00d655ad-5646-42b8-baf8-f02ee28ab4ac-kube-api-access-p7v5l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-x7wxr\" (UID: \"00d655ad-5646-42b8-baf8-f02ee28ab4ac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.088239 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.088302 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.088341 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.088564 4801 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.088633 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:22.588610566 +0000 UTC m=+971.290510749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "metrics-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.089605 4801 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.089645 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:22.589633335 +0000 UTC m=+971.291533518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.089732 4801 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.090615 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert podName:da0d6842-c349-4dfb-8b0d-777cabdc8941 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:23.08979375 +0000 UTC m=+971.791693933 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" (UID: "da0d6842-c349-4dfb-8b0d-777cabdc8941") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.099628 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g"]
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.116816 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4qvt\" (UniqueName: \"kubernetes.io/projected/cb47af42-6498-44e4-8a55-04e5fff23296-kube-api-access-t4qvt\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.117897 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7v5l\" (UniqueName: \"kubernetes.io/projected/00d655ad-5646-42b8-baf8-f02ee28ab4ac-kube-api-access-p7v5l\") pod \"rabbitmq-cluster-operator-manager-668c99d594-x7wxr\" (UID: \"00d655ad-5646-42b8-baf8-f02ee28ab4ac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.167331 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.282031 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.334839 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f"]
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.352319 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm"]
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.599527 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.599868 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.599980 4801 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.600030 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:23.600013144 +0000 UTC m=+972.301913327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "metrics-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.600287 4801 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.600367 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:23.600344123 +0000 UTC m=+972.302244376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.602110 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5"]
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.635060 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7"]
Jan 22 14:20:22 crc kubenswrapper[4801]: W0122 14:20:22.643897 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode05677d8_9dbb_489c_9eb8_84ff6981776c.slice/crio-889cd18e0f05a6114101930fb28b1743f28c474b6651cf5fa1efbb439f4003c7 WatchSource:0}: Error finding container 889cd18e0f05a6114101930fb28b1743f28c474b6651cf5fa1efbb439f4003c7: Status 404 returned error can't find the container with id 889cd18e0f05a6114101930fb28b1743f28c474b6651cf5fa1efbb439f4003c7
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.702419 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv"
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.702724 4801 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: E0122 14:20:22.702774 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert podName:046a1aec-268f-4b7e-9644-572026853eaa nodeName:}" failed. No retries permitted until 2026-01-22 14:20:24.702759058 +0000 UTC m=+973.404659241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert") pod "infra-operator-controller-manager-54ccf4f85d-ffdkv" (UID: "046a1aec-268f-4b7e-9644-572026853eaa") : secret "infra-operator-webhook-server-cert" not found
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.725420 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77"]
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.781993 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq"]
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.924118 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" event={"ID":"3d1aad44-ff2f-4309-9984-7da6200820ac","Type":"ContainerStarted","Data":"322acf4895e78087e6ccab76b95029f81176b717c0f17398c53e0a53b1f16f75"}
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.926116 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" event={"ID":"62197d9f-468d-4a38-8b24-b4639ec0b1c0","Type":"ContainerStarted","Data":"d51949a8c4d749e3592c841c7c9a51ae942facc35a39d41bb1d33b7f6abb8600"}
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.927625 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" event={"ID":"e1799966-64ec-4e45-b7a5-45715376926c","Type":"ContainerStarted","Data":"7acfa05013c3d0e1f7a6048d10cfd48a8c4d9119211f39714ae551ed76709b11"}
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.944278 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" event={"ID":"bec5ba53-958e-48be-af46-cf073df0d161","Type":"ContainerStarted","Data":"3f0c6d3df1bc1a0ea71804d034eff460a7a53e937228e2e9fc8fdd95ad887506"}
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.947208 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" event={"ID":"e05677d8-9dbb-489c-9eb8-84ff6981776c","Type":"ContainerStarted","Data":"889cd18e0f05a6114101930fb28b1743f28c474b6651cf5fa1efbb439f4003c7"}
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.952863 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" event={"ID":"8bd2b2a1-52ee-41f6-ada0-652b74c4542b","Type":"ContainerStarted","Data":"082f33e77d089e6bdb9fe539963a66ee2b2293a800f69f254dc1be9df51c9e5e"}
Jan 22 14:20:22 crc kubenswrapper[4801]: I0122 14:20:22.956721 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" event={"ID":"f9e65da4-cf22-44e4-8e0e-f33aee5413ba","Type":"ContainerStarted","Data":"585faba66ddbf3dcd4f3ded950f69a02fe136998f2034c11fecf358dd2ce0cf8"}
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.050894 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.058498 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6"]
Jan 22 14:20:23 crc kubenswrapper[4801]: W0122 14:20:23.061074 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd64f87cc_2d65_4b7e_a8c4_4fcbbf57ad43.slice/crio-78359ecbe1dc0d2a7158e080e78eb52c51ae1d78734db5153b1030134f3b3a1d WatchSource:0}: Error finding container 78359ecbe1dc0d2a7158e080e78eb52c51ae1d78734db5153b1030134f3b3a1d: Status 404 returned error can't find the container with id 78359ecbe1dc0d2a7158e080e78eb52c51ae1d78734db5153b1030134f3b3a1d
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.064830 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.106740 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk"
Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.106929 4801 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.107020 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert podName:da0d6842-c349-4dfb-8b0d-777cabdc8941 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:25.106994425 +0000 UTC m=+973.808894658 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" (UID: "da0d6842-c349-4dfb-8b0d-777cabdc8941") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.432314 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.434323 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls"]
Jan 22 14:20:23 crc kubenswrapper[4801]: W0122 14:20:23.448087 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552434b1_06fc_4f26_972f_e53d29f623e9.slice/crio-c9db01d25424ceea7e539057e2bbab8345568ca3a4130b984f61c18c6de0a52b WatchSource:0}: Error finding container c9db01d25424ceea7e539057e2bbab8345568ca3a4130b984f61c18c6de0a52b: Status 404 returned error can't find the container with id c9db01d25424ceea7e539057e2bbab8345568ca3a4130b984f61c18c6de0a52b
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.456684 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx"]
Jan 22 14:20:23 crc kubenswrapper[4801]: W0122 14:20:23.458802 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5890193_a2af_432d_b66b_62f480c03768.slice/crio-006a7a6323a3edc7dbc6c06ce22ad42807335364bfda5b11f6bbbee31e98af7f WatchSource:0}: Error finding container 006a7a6323a3edc7dbc6c06ce22ad42807335364bfda5b11f6bbbee31e98af7f: Status 404 returned error can't find the container with id 006a7a6323a3edc7dbc6c06ce22ad42807335364bfda5b11f6bbbee31e98af7f
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.487290 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.504980 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.514131 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.522343 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.530129 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr"]
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.538302 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"]
Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.543595 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2gqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7bd9774b6-d4lmf_openstack-operators(dd6ed712-5542-4847-af9b-a536b29000b2): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.544767 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" podUID="dd6ed712-5542-4847-af9b-a536b29000b2"
Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.545015 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"]
Jan 22 14:20:23 crc kubenswrapper[4801]: W0122 14:20:23.550772 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3932e04_5349_4181_8019_f651145fa996.slice/crio-7f716cf4ccecfd8c651cbab17577339889829f13a76d1647919417ed8628ab36 WatchSource:0}: Error finding container 7f716cf4ccecfd8c651cbab17577339889829f13a76d1647919417ed8628ab36: Status 404 returned error can't find the container with id 7f716cf4ccecfd8c651cbab17577339889829f13a76d1647919417ed8628ab36
Jan 22 14:20:23 crc kubenswrapper[4801]: W0122 14:20:23.553120 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dd65530_964d_4b59_a201_023f968c59f8.slice/crio-31166d72c66257d8f5c35acd9cc95482d9332ef9bfd83511ae8248aa4353d7c1 WatchSource:0}: Error finding container 31166d72c66257d8f5c35acd9cc95482d9332ef9bfd83511ae8248aa4353d7c1: Status 404 returned error can't find the container with id 31166d72c66257d8f5c35acd9cc95482d9332ef9bfd83511ae8248aa4353d7c1
Jan 22 14:20:23 crc kubenswrapper[4801]: W0122 14:20:23.557670 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d655ad_5646_42b8_baf8_f02ee28ab4ac.slice/crio-452a0ef454626e200b217220c3d7ab1a32adabc47ba8a35a9a9cb7c3ee5d3763 WatchSource:0}: Error finding container 452a0ef454626e200b217220c3d7ab1a32adabc47ba8a35a9a9cb7c3ee5d3763: Status 404 returned error can't find the container with id 452a0ef454626e200b217220c3d7ab1a32adabc47ba8a35a9a9cb7c3ee5d3763
Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.563488 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7v5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-x7wxr_openstack-operators(00d655ad-5646-42b8-baf8-f02ee28ab4ac): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.568546 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" podUID="00d655ad-5646-42b8-baf8-f02ee28ab4ac"
Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.572094 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7vdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-547cbdb99f-t8r9r_openstack-operators(7dd65530-964d-4b59-a201-023f968c59f8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.574283 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" podUID="7dd65530-964d-4b59-a201-023f968c59f8" Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.586310 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:2d6d13b3c28e45c6bec980b8808dda8da4723ae87e66d04f53d52c3b3c51612b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsp5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5ffb9c6597-zmpxx_openstack-operators(e3932e04-5349-4181-8019-f651145fa996): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.587839 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx" podUID="e3932e04-5349-4181-8019-f651145fa996" Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.628755 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.629015 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.629419 4801 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.629548 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:25.629510919 +0000 UTC m=+974.331411112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "metrics-server-cert" not found Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.630061 4801 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.630095 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:25.630084556 +0000 UTC m=+974.331984739 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "webhook-server-cert" not found Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.980096 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx" event={"ID":"e3932e04-5349-4181-8019-f651145fa996","Type":"ContainerStarted","Data":"7f716cf4ccecfd8c651cbab17577339889829f13a76d1647919417ed8628ab36"} Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.982744 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:2d6d13b3c28e45c6bec980b8808dda8da4723ae87e66d04f53d52c3b3c51612b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx" podUID="e3932e04-5349-4181-8019-f651145fa996" Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 14:20:23.983778 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" event={"ID":"dd6ed712-5542-4847-af9b-a536b29000b2","Type":"ContainerStarted","Data":"9c19569408728217612a0662865a34af3a3bafe1960d98a80573bd7aeec960de"} Jan 22 14:20:23 crc kubenswrapper[4801]: E0122 14:20:23.985199 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" podUID="dd6ed712-5542-4847-af9b-a536b29000b2" Jan 22 14:20:23 crc kubenswrapper[4801]: I0122 
14:20:23.986330 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" event={"ID":"bd9524a3-7561-4e18-84f7-44bd95a5475c","Type":"ContainerStarted","Data":"f790961fe52c3223178771441177533d4e5f252d61bb1a48a445a87dc25dd476"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.025235 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" event={"ID":"552434b1-06fc-4f26-972f-e53d29f623e9","Type":"ContainerStarted","Data":"c9db01d25424ceea7e539057e2bbab8345568ca3a4130b984f61c18c6de0a52b"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.028208 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" event={"ID":"d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43","Type":"ContainerStarted","Data":"78359ecbe1dc0d2a7158e080e78eb52c51ae1d78734db5153b1030134f3b3a1d"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.042011 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" event={"ID":"697b4f23-f1b2-4c36-b453-38bff993b462","Type":"ContainerStarted","Data":"57ffa6e81ac47907c95bf468c4ce8534ca632204764e0828febed347fbc724ab"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.047707 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw" event={"ID":"5f1bd071-a488-4023-a2fd-7e8a8b0c998a","Type":"ContainerStarted","Data":"e6d4a0c3f2f42677f7dea3f472dc68234f46e6d0131928074eb6f07a842d54e7"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.049199 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" 
event={"ID":"c5890193-a2af-432d-b66b-62f480c03768","Type":"ContainerStarted","Data":"006a7a6323a3edc7dbc6c06ce22ad42807335364bfda5b11f6bbbee31e98af7f"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.053729 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" event={"ID":"00d655ad-5646-42b8-baf8-f02ee28ab4ac","Type":"ContainerStarted","Data":"452a0ef454626e200b217220c3d7ab1a32adabc47ba8a35a9a9cb7c3ee5d3763"} Jan 22 14:20:24 crc kubenswrapper[4801]: E0122 14:20:24.056847 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" podUID="00d655ad-5646-42b8-baf8-f02ee28ab4ac" Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.065816 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" event={"ID":"31afca8d-2c0d-419c-ba85-de05d929490b","Type":"ContainerStarted","Data":"e4ed64bad2a56d66e6df0a3f8f1b8fbf144dd6dfca16f3fb2d4a21cc90a1aaa3"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.093122 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" event={"ID":"def6ced9-f64a-428e-ae30-6b1f97648c5f","Type":"ContainerStarted","Data":"fa9a32a9e07d0353d52b95359c2455c434ec84c18d93a38d86c85c8d0ff031f7"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.096705 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" 
event={"ID":"7dd65530-964d-4b59-a201-023f968c59f8","Type":"ContainerStarted","Data":"31166d72c66257d8f5c35acd9cc95482d9332ef9bfd83511ae8248aa4353d7c1"} Jan 22 14:20:24 crc kubenswrapper[4801]: E0122 14:20:24.099282 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" podUID="7dd65530-964d-4b59-a201-023f968c59f8" Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.101156 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" event={"ID":"1fed9af5-123f-4db7-84d2-192d22c9024f","Type":"ContainerStarted","Data":"b03cd61687c58f5c6330abb2e3c280ac730738a59309e9babea05815c30d0d93"} Jan 22 14:20:24 crc kubenswrapper[4801]: I0122 14:20:24.769626 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:24 crc kubenswrapper[4801]: E0122 14:20:24.769803 4801 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:24 crc kubenswrapper[4801]: E0122 14:20:24.769849 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert podName:046a1aec-268f-4b7e-9644-572026853eaa nodeName:}" failed. No retries permitted until 2026-01-22 14:20:28.76983533 +0000 UTC m=+977.471735513 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert") pod "infra-operator-controller-manager-54ccf4f85d-ffdkv" (UID: "046a1aec-268f-4b7e-9644-572026853eaa") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.109939 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" podUID="7dd65530-964d-4b59-a201-023f968c59f8" Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.110514 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" podUID="dd6ed712-5542-4847-af9b-a536b29000b2" Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.110556 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:2d6d13b3c28e45c6bec980b8808dda8da4723ae87e66d04f53d52c3b3c51612b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx" podUID="e3932e04-5349-4181-8019-f651145fa996" Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.110594 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" podUID="00d655ad-5646-42b8-baf8-f02ee28ab4ac" Jan 22 14:20:25 crc kubenswrapper[4801]: I0122 14:20:25.146582 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.146723 4801 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.146769 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert podName:da0d6842-c349-4dfb-8b0d-777cabdc8941 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:29.146755649 +0000 UTC m=+977.848655833 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" (UID: "da0d6842-c349-4dfb-8b0d-777cabdc8941") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:25 crc kubenswrapper[4801]: I0122 14:20:25.658210 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:25 crc kubenswrapper[4801]: I0122 14:20:25.658655 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.659190 4801 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.659271 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:29.659253167 +0000 UTC m=+978.361153350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "webhook-server-cert" not found Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.660625 4801 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:20:25 crc kubenswrapper[4801]: E0122 14:20:25.661916 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:29.661891143 +0000 UTC m=+978.363791396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "metrics-server-cert" not found Jan 22 14:20:28 crc kubenswrapper[4801]: I0122 14:20:28.802214 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:28 crc kubenswrapper[4801]: E0122 14:20:28.802744 4801 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:28 crc kubenswrapper[4801]: E0122 14:20:28.802799 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert 
podName:046a1aec-268f-4b7e-9644-572026853eaa nodeName:}" failed. No retries permitted until 2026-01-22 14:20:36.802781622 +0000 UTC m=+985.504681815 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert") pod "infra-operator-controller-manager-54ccf4f85d-ffdkv" (UID: "046a1aec-268f-4b7e-9644-572026853eaa") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:29 crc kubenswrapper[4801]: I0122 14:20:29.209331 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:29 crc kubenswrapper[4801]: E0122 14:20:29.209539 4801 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:29 crc kubenswrapper[4801]: E0122 14:20:29.209622 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert podName:da0d6842-c349-4dfb-8b0d-777cabdc8941 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:37.209601481 +0000 UTC m=+985.911501664 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" (UID: "da0d6842-c349-4dfb-8b0d-777cabdc8941") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:29 crc kubenswrapper[4801]: I0122 14:20:29.715402 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:29 crc kubenswrapper[4801]: E0122 14:20:29.715563 4801 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:20:29 crc kubenswrapper[4801]: E0122 14:20:29.716533 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:37.716226123 +0000 UTC m=+986.418126306 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "webhook-server-cert" not found Jan 22 14:20:29 crc kubenswrapper[4801]: I0122 14:20:29.716580 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:29 crc kubenswrapper[4801]: E0122 14:20:29.716761 4801 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:20:29 crc kubenswrapper[4801]: E0122 14:20:29.716807 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:37.716793379 +0000 UTC m=+986.418693632 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "metrics-server-cert" not found Jan 22 14:20:36 crc kubenswrapper[4801]: E0122 14:20:36.198523 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:d3c55b59cb192799f8d31196c55c9e9bb3cd38aef7ec51ef257dabf1548e8b30" Jan 22 14:20:36 crc kubenswrapper[4801]: E0122 14:20:36.199248 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:d3c55b59cb192799f8d31196c55c9e9bb3cd38aef7ec51ef257dabf1548e8b30,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jgrtg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-69d6c9f5b8-v4msq_openstack-operators(8bd2b2a1-52ee-41f6-ada0-652b74c4542b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:36 crc kubenswrapper[4801]: E0122 14:20:36.200351 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" podUID="8bd2b2a1-52ee-41f6-ada0-652b74c4542b" Jan 22 14:20:36 crc kubenswrapper[4801]: E0122 14:20:36.315516 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:d3c55b59cb192799f8d31196c55c9e9bb3cd38aef7ec51ef257dabf1548e8b30\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" podUID="8bd2b2a1-52ee-41f6-ada0-652b74c4542b" Jan 22 14:20:36 crc kubenswrapper[4801]: I0122 14:20:36.881352 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:36 crc kubenswrapper[4801]: E0122 14:20:36.881897 4801 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:36 crc kubenswrapper[4801]: E0122 14:20:36.881945 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert podName:046a1aec-268f-4b7e-9644-572026853eaa nodeName:}" failed. No retries permitted until 2026-01-22 14:20:52.88193101 +0000 UTC m=+1001.583831193 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert") pod "infra-operator-controller-manager-54ccf4f85d-ffdkv" (UID: "046a1aec-268f-4b7e-9644-572026853eaa") : secret "infra-operator-webhook-server-cert" not found Jan 22 14:20:37 crc kubenswrapper[4801]: I0122 14:20:37.306543 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.306745 4801 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.306823 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert podName:da0d6842-c349-4dfb-8b0d-777cabdc8941 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:53.306801805 +0000 UTC m=+1002.008701988 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" (UID: "da0d6842-c349-4dfb-8b0d-777cabdc8941") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.671774 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8bee4480babd6fd8f686e0ba52a304acb6ffb90f09c7c57e7f5df5f7658836d8" Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.672066 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8bee4480babd6fd8f686e0ba52a304acb6ffb90f09c7c57e7f5df5f7658836d8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rqtdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-78c6999f6f-rlqgc_openstack-operators(d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.673318 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" podUID="d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43" Jan 22 14:20:37 crc kubenswrapper[4801]: I0122 14:20:37.815147 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod 
\"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:37 crc kubenswrapper[4801]: I0122 14:20:37.815240 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.815388 4801 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.815510 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. No retries permitted until 2026-01-22 14:20:53.815486545 +0000 UTC m=+1002.517386788 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "webhook-server-cert" not found Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.815397 4801 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 22 14:20:37 crc kubenswrapper[4801]: E0122 14:20:37.815954 4801 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs podName:cb47af42-6498-44e4-8a55-04e5fff23296 nodeName:}" failed. 
No retries permitted until 2026-01-22 14:20:53.815904147 +0000 UTC m=+1002.517804330 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs") pod "openstack-operator-controller-manager-6dbd46c5ff-nxzjt" (UID: "cb47af42-6498-44e4-8a55-04e5fff23296") : secret "metrics-server-cert" not found Jan 22 14:20:38 crc kubenswrapper[4801]: E0122 14:20:38.309707 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8bee4480babd6fd8f686e0ba52a304acb6ffb90f09c7c57e7f5df5f7658836d8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" podUID="d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43" Jan 22 14:20:45 crc kubenswrapper[4801]: E0122 14:20:45.167494 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71" Jan 22 14:20:45 crc kubenswrapper[4801]: E0122 14:20:45.168667 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-48lb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-c87fff755-fphcj_openstack-operators(1fed9af5-123f-4db7-84d2-192d22c9024f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:45 crc kubenswrapper[4801]: E0122 14:20:45.169937 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" podUID="1fed9af5-123f-4db7-84d2-192d22c9024f" Jan 22 14:20:45 crc kubenswrapper[4801]: E0122 14:20:45.371816 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" podUID="1fed9af5-123f-4db7-84d2-192d22c9024f" Jan 22 14:20:46 crc kubenswrapper[4801]: E0122 14:20:46.216536 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:b57d65d2a968705b9067192a7cb33bd4a12489db87e1d05de78c076f2062cab4" Jan 22 14:20:46 crc kubenswrapper[4801]: E0122 14:20:46.217371 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:b57d65d2a968705b9067192a7cb33bd4a12489db87e1d05de78c076f2062cab4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bmtsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5d8f59fb49-cthv6_openstack-operators(def6ced9-f64a-428e-ae30-6b1f97648c5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:46 crc kubenswrapper[4801]: E0122 14:20:46.218835 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" podUID="def6ced9-f64a-428e-ae30-6b1f97648c5f" Jan 22 14:20:46 crc kubenswrapper[4801]: E0122 14:20:46.380471 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:b57d65d2a968705b9067192a7cb33bd4a12489db87e1d05de78c076f2062cab4\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" podUID="def6ced9-f64a-428e-ae30-6b1f97648c5f" Jan 22 14:20:47 crc kubenswrapper[4801]: E0122 14:20:47.376035 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:9caae9b3ee328df678baa26454e45e47693acdadb27f9c635680597aaec43337" Jan 22 14:20:47 crc kubenswrapper[4801]: E0122 14:20:47.376212 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:9caae9b3ee328df678baa26454e45e47693acdadb27f9c635680597aaec43337,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bgdv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-78fdd796fd-9lj8x_openstack-operators(31afca8d-2c0d-419c-ba85-de05d929490b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:47 crc kubenswrapper[4801]: E0122 14:20:47.377657 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" podUID="31afca8d-2c0d-419c-ba85-de05d929490b" Jan 22 14:20:47 crc kubenswrapper[4801]: E0122 14:20:47.398139 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:9caae9b3ee328df678baa26454e45e47693acdadb27f9c635680597aaec43337\\\"\"" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" podUID="31afca8d-2c0d-419c-ba85-de05d929490b" Jan 22 14:20:48 crc kubenswrapper[4801]: E0122 14:20:48.860409 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:65cfe5b9d5b0571aaf8ff9840b12cc56e90ca4cef162dd260c3a9fa2b52c6dd0" Jan 22 14:20:48 crc kubenswrapper[4801]: E0122 14:20:48.860622 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:65cfe5b9d5b0571aaf8ff9840b12cc56e90ca4cef162dd260c3a9fa2b52c6dd0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dwsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5d646b7d76-tw2h5_openstack-operators(697b4f23-f1b2-4c36-b453-38bff993b462): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:48 crc kubenswrapper[4801]: E0122 14:20:48.861874 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" podUID="697b4f23-f1b2-4c36-b453-38bff993b462" Jan 22 14:20:49 crc kubenswrapper[4801]: E0122 14:20:49.455192 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:65cfe5b9d5b0571aaf8ff9840b12cc56e90ca4cef162dd260c3a9fa2b52c6dd0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" podUID="697b4f23-f1b2-4c36-b453-38bff993b462" Jan 22 14:20:49 crc kubenswrapper[4801]: E0122 14:20:49.717567 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127" Jan 22 14:20:49 crc kubenswrapper[4801]: E0122 14:20:49.717840 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dgnxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-85cd9769bb-h6dmw_openstack-operators(5f1bd071-a488-4023-a2fd-7e8a8b0c998a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:49 crc kubenswrapper[4801]: E0122 14:20:49.719078 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw" podUID="5f1bd071-a488-4023-a2fd-7e8a8b0c998a" Jan 22 14:20:50 crc kubenswrapper[4801]: E0122 14:20:50.453834 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:e02722d7581bfe1c5fc13e2fa6811d8665102ba86635c77547abf6b933cde127\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw" podUID="5f1bd071-a488-4023-a2fd-7e8a8b0c998a" Jan 22 14:20:50 crc kubenswrapper[4801]: E0122 14:20:50.457268 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d" Jan 22 14:20:50 crc kubenswrapper[4801]: E0122 14:20:50.457507 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6h265,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-69797bbcbd-tvbhx_openstack-operators(bd9524a3-7561-4e18-84f7-44bd95a5475c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:50 crc kubenswrapper[4801]: E0122 14:20:50.458693 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" podUID="bd9524a3-7561-4e18-84f7-44bd95a5475c" Jan 22 14:20:51 crc kubenswrapper[4801]: E0122 14:20:51.358582 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf" Jan 22 14:20:51 crc kubenswrapper[4801]: E0122 14:20:51.358966 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pkljh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-xlsls_openstack-operators(552434b1-06fc-4f26-972f-e53d29f623e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:51 crc kubenswrapper[4801]: E0122 14:20:51.360340 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" podUID="552434b1-06fc-4f26-972f-e53d29f623e9" Jan 22 14:20:51 crc kubenswrapper[4801]: E0122 14:20:51.457614 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" podUID="552434b1-06fc-4f26-972f-e53d29f623e9" Jan 22 14:20:51 crc kubenswrapper[4801]: E0122 14:20:51.458247 4801 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" podUID="bd9524a3-7561-4e18-84f7-44bd95a5475c" Jan 22 14:20:52 crc kubenswrapper[4801]: I0122 14:20:52.896702 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:52 crc kubenswrapper[4801]: I0122 14:20:52.902613 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/046a1aec-268f-4b7e-9644-572026853eaa-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-ffdkv\" (UID: \"046a1aec-268f-4b7e-9644-572026853eaa\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.012473 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.404156 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.407966 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da0d6842-c349-4dfb-8b0d-777cabdc8941-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk\" (UID: \"da0d6842-c349-4dfb-8b0d-777cabdc8941\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.559776 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.910805 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.910917 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.919513 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-metrics-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:53 crc kubenswrapper[4801]: I0122 14:20:53.920143 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cb47af42-6498-44e4-8a55-04e5fff23296-webhook-certs\") pod \"openstack-operator-controller-manager-6dbd46c5ff-nxzjt\" (UID: \"cb47af42-6498-44e4-8a55-04e5fff23296\") " pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:54 crc kubenswrapper[4801]: I0122 14:20:54.078623 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" Jan 22 14:20:54 crc kubenswrapper[4801]: E0122 14:20:54.698275 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:e950ac2df7be78ae0cbcf62fe12ee7a06b628f1903da6fcb741609e857eb1a7f" Jan 22 14:20:54 crc kubenswrapper[4801]: E0122 14:20:54.698560 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:e950ac2df7be78ae0cbcf62fe12ee7a06b628f1903da6fcb741609e857eb1a7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbldr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-69cf5d4557-jjdzm_openstack-operators(f9e65da4-cf22-44e4-8e0e-f33aee5413ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:54 crc kubenswrapper[4801]: E0122 14:20:54.700033 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" podUID="f9e65da4-cf22-44e4-8e0e-f33aee5413ba" Jan 22 14:20:55 crc kubenswrapper[4801]: E0122 14:20:55.648269 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:e950ac2df7be78ae0cbcf62fe12ee7a06b628f1903da6fcb741609e857eb1a7f\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" podUID="f9e65da4-cf22-44e4-8e0e-f33aee5413ba" Jan 22 14:20:56 crc kubenswrapper[4801]: E0122 14:20:56.823808 4801 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:4e995cfa360a9d595a01b9c0541ab934692f2374203cb5738127dd784f793831" Jan 22 14:20:56 crc kubenswrapper[4801]: E0122 14:20:56.824724 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:4e995cfa360a9d595a01b9c0541ab934692f2374203cb5738127dd784f793831,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rqgdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-6b8bc8d87d-ss9zg_openstack-operators(c5890193-a2af-432d-b66b-62f480c03768): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:56 crc kubenswrapper[4801]: E0122 14:20:56.826710 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" podUID="c5890193-a2af-432d-b66b-62f480c03768" Jan 22 14:20:57 crc kubenswrapper[4801]: E0122 14:20:57.659555 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:4e995cfa360a9d595a01b9c0541ab934692f2374203cb5738127dd784f793831\\\"\"" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" podUID="c5890193-a2af-432d-b66b-62f480c03768" Jan 22 14:20:57 crc kubenswrapper[4801]: E0122 14:20:57.889850 4801 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922" Jan 22 14:20:57 crc kubenswrapper[4801]: E0122 14:20:57.890034 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7vdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-547cbdb99f-t8r9r_openstack-operators(7dd65530-964d-4b59-a201-023f968c59f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:20:57 crc kubenswrapper[4801]: E0122 14:20:57.891756 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" podUID="7dd65530-964d-4b59-a201-023f968c59f8" Jan 22 14:21:00 crc kubenswrapper[4801]: E0122 14:21:00.210038 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5" Jan 22 14:21:00 crc kubenswrapper[4801]: E0122 14:21:00.210510 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2gqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7bd9774b6-d4lmf_openstack-operators(dd6ed712-5542-4847-af9b-a536b29000b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:21:00 crc kubenswrapper[4801]: E0122 14:21:00.211661 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" podUID="dd6ed712-5542-4847-af9b-a536b29000b2" Jan 22 14:21:01 crc kubenswrapper[4801]: E0122 14:21:01.136005 4801 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349" Jan 22 14:21:01 crc kubenswrapper[4801]: E0122 14:21:01.136198 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8jsnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b8b6d4659-dzf5f_openstack-operators(bec5ba53-958e-48be-af46-cf073df0d161): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:21:01 crc kubenswrapper[4801]: E0122 14:21:01.137366 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" podUID="bec5ba53-958e-48be-af46-cf073df0d161" Jan 22 14:21:01 crc kubenswrapper[4801]: E0122 14:21:01.682007 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" podUID="bec5ba53-958e-48be-af46-cf073df0d161" Jan 22 14:21:01 crc kubenswrapper[4801]: E0122 14:21:01.777803 4801 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 22 14:21:01 crc kubenswrapper[4801]: E0122 14:21:01.778034 4801 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7v5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-x7wxr_openstack-operators(00d655ad-5646-42b8-baf8-f02ee28ab4ac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 14:21:01 crc kubenswrapper[4801]: E0122 14:21:01.779382 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" podUID="00d655ad-5646-42b8-baf8-f02ee28ab4ac" Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.334987 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk"] Jan 22 14:21:02 crc kubenswrapper[4801]: W0122 14:21:02.393698 4801 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0d6842_c349_4dfb_8b0d_777cabdc8941.slice/crio-355218cefdd8e30bf8e6d64d82290f7de24d4e7451dd8ff5976b23122d86d8f0 WatchSource:0}: Error finding container 355218cefdd8e30bf8e6d64d82290f7de24d4e7451dd8ff5976b23122d86d8f0: Status 404 returned error can't find the container with id 355218cefdd8e30bf8e6d64d82290f7de24d4e7451dd8ff5976b23122d86d8f0 Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.425207 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"] Jan 22 14:21:02 crc kubenswrapper[4801]: W0122 14:21:02.436732 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb47af42_6498_44e4_8a55_04e5fff23296.slice/crio-4819ef36f5b120913f64ee5654f77fb0fb6c3a03a059b314c863b9fe45d4da45 WatchSource:0}: Error finding container 4819ef36f5b120913f64ee5654f77fb0fb6c3a03a059b314c863b9fe45d4da45: Status 404 returned error can't find the container with id 4819ef36f5b120913f64ee5654f77fb0fb6c3a03a059b314c863b9fe45d4da45 Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.461374 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv"] Jan 22 14:21:02 crc kubenswrapper[4801]: W0122 14:21:02.479253 4801 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046a1aec_268f_4b7e_9644_572026853eaa.slice/crio-47e7a0809ccf8b4033eca46838f68460bdd6f22e9c4470dc10c0189f4c9cc496 WatchSource:0}: Error finding container 47e7a0809ccf8b4033eca46838f68460bdd6f22e9c4470dc10c0189f4c9cc496: Status 404 returned error can't find the container with id 47e7a0809ccf8b4033eca46838f68460bdd6f22e9c4470dc10c0189f4c9cc496 Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.685928 4801 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" event={"ID":"1fed9af5-123f-4db7-84d2-192d22c9024f","Type":"ContainerStarted","Data":"a09bc6a294cacd065979f196a8c4f7117b33bcbdc9952b3d306010cef7f6a203"} Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.687036 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.692663 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" event={"ID":"bd9524a3-7561-4e18-84f7-44bd95a5475c","Type":"ContainerStarted","Data":"f10bfeb3c75257c2f42e7c49d3c632710d37ca135dbbb06444b3c0c40ce51af7"} Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.692926 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.695359 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" event={"ID":"da0d6842-c349-4dfb-8b0d-777cabdc8941","Type":"ContainerStarted","Data":"355218cefdd8e30bf8e6d64d82290f7de24d4e7451dd8ff5976b23122d86d8f0"} Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.697635 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" event={"ID":"8bd2b2a1-52ee-41f6-ada0-652b74c4542b","Type":"ContainerStarted","Data":"7abf2ee751d212fbc96e6c6d186742a255570dcbc2be2b493fb9a844f5991604"} Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.698311 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.700762 4801 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" event={"ID":"e1799966-64ec-4e45-b7a5-45715376926c","Type":"ContainerStarted","Data":"90d7fc5ccfb39d40bd43ebce6322c36ff4529bdd0e16adda5b43fbc02f8d5502"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.701184 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.703346 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" event={"ID":"cb47af42-6498-44e4-8a55-04e5fff23296","Type":"ContainerStarted","Data":"6be3060d1b547765bdef58eb7270946f25924150fc55d9c81ecae0dbbe4869bc"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.703368 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" event={"ID":"cb47af42-6498-44e4-8a55-04e5fff23296","Type":"ContainerStarted","Data":"4819ef36f5b120913f64ee5654f77fb0fb6c3a03a059b314c863b9fe45d4da45"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.703793 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.705726 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" event={"ID":"d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43","Type":"ContainerStarted","Data":"b19827f8ea5627ffb212abac31b719ea45bbf884ccf1c7393ab18f1b88f8187a"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.706104 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.708057 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" event={"ID":"e05677d8-9dbb-489c-9eb8-84ff6981776c","Type":"ContainerStarted","Data":"5575dca969cadc9e6322b554eca8da874298913dde78397ae28eb93cd12d96ff"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.708478 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.711344 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" event={"ID":"31afca8d-2c0d-419c-ba85-de05d929490b","Type":"ContainerStarted","Data":"27dbb9f92b7ce7c52d7bbe2d88dec53a038dc1c3c203ea768c2062cd1f1bc68e"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.712016 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.712212 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" event={"ID":"046a1aec-268f-4b7e-9644-572026853eaa","Type":"ContainerStarted","Data":"47e7a0809ccf8b4033eca46838f68460bdd6f22e9c4470dc10c0189f4c9cc496"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.713535 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx" event={"ID":"e3932e04-5349-4181-8019-f651145fa996","Type":"ContainerStarted","Data":"8ce6bd55e159de7ee74ed36da9db252ceb07f4e2733c622530c68ec0a83f59ae"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.713736 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.714670 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" event={"ID":"def6ced9-f64a-428e-ae30-6b1f97648c5f","Type":"ContainerStarted","Data":"aaa9a4b287532b677524442050128e7ee96a1e9e6e0af2fa8802532b49c8cba0"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.714843 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.716226 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" event={"ID":"3d1aad44-ff2f-4309-9984-7da6200820ac","Type":"ContainerStarted","Data":"97c876da342232ed61063cf1d947b2ec7d0c9e58ce44cf7e5f288eafb4b96b6c"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.716334 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.717494 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" event={"ID":"62197d9f-468d-4a38-8b24-b4639ec0b1c0","Type":"ContainerStarted","Data":"7bc48f34e3c773b3e4f5145405ff0b0f4bd4663cf76c67cbb31906885bee5aee"}
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.717717 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.775112 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj" podStartSLOduration=3.793464417 podStartE2EDuration="42.775092631s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.069796107 +0000 UTC m=+971.771696290" lastFinishedPulling="2026-01-22 14:21:02.051424321 +0000 UTC m=+1010.753324504" observedRunningTime="2026-01-22 14:21:02.771616492 +0000 UTC m=+1011.473516685" watchObservedRunningTime="2026-01-22 14:21:02.775092631 +0000 UTC m=+1011.476992814"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.788387 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7" podStartSLOduration=5.239328514 podStartE2EDuration="42.788370939s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:22.654817344 +0000 UTC m=+971.356717527" lastFinishedPulling="2026-01-22 14:21:00.203859769 +0000 UTC m=+1008.905759952" observedRunningTime="2026-01-22 14:21:02.787778362 +0000 UTC m=+1011.489678545" watchObservedRunningTime="2026-01-22 14:21:02.788370939 +0000 UTC m=+1011.490271122"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.837437 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x" podStartSLOduration=4.297049042 podStartE2EDuration="42.837418915s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.510994866 +0000 UTC m=+972.212895049" lastFinishedPulling="2026-01-22 14:21:02.051364739 +0000 UTC m=+1010.753264922" observedRunningTime="2026-01-22 14:21:02.836586381 +0000 UTC m=+1011.538486574" watchObservedRunningTime="2026-01-22 14:21:02.837418915 +0000 UTC m=+1011.539319098"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.840432 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq" podStartSLOduration=3.809691269 podStartE2EDuration="42.840422551s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:22.869819404 +0000 UTC m=+971.571719587" lastFinishedPulling="2026-01-22 14:21:01.900550686 +0000 UTC m=+1010.602450869" observedRunningTime="2026-01-22 14:21:02.815848651 +0000 UTC m=+1011.517748834" watchObservedRunningTime="2026-01-22 14:21:02.840422551 +0000 UTC m=+1011.542322734"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.882060 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77" podStartSLOduration=5.543124912 podStartE2EDuration="42.882042075s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:22.864899434 +0000 UTC m=+971.566799617" lastFinishedPulling="2026-01-22 14:21:00.203816597 +0000 UTC m=+1008.905716780" observedRunningTime="2026-01-22 14:21:02.856423596 +0000 UTC m=+1011.558323789" watchObservedRunningTime="2026-01-22 14:21:02.882042075 +0000 UTC m=+1011.583942318"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.890788 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5" podStartSLOduration=5.323449249 podStartE2EDuration="42.890765434s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:22.636464041 +0000 UTC m=+971.338364224" lastFinishedPulling="2026-01-22 14:21:00.203780216 +0000 UTC m=+1008.905680409" observedRunningTime="2026-01-22 14:21:02.881576882 +0000 UTC m=+1011.583477065" watchObservedRunningTime="2026-01-22 14:21:02.890765434 +0000 UTC m=+1011.592665637"
Jan 22 14:21:02 crc kubenswrapper[4801]: I0122 14:21:02.958337 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6" podStartSLOduration=3.9643984530000003 podStartE2EDuration="42.958323147s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.069679823 +0000 UTC m=+971.771579996" lastFinishedPulling="2026-01-22 14:21:02.063604517 +0000 UTC m=+1010.765504690" observedRunningTime="2026-01-22 14:21:02.957219995 +0000 UTC m=+1011.659120198" watchObservedRunningTime="2026-01-22 14:21:02.958323147 +0000 UTC m=+1011.660223330"
Jan 22 14:21:03 crc kubenswrapper[4801]: I0122 14:21:03.030937 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g" podStartSLOduration=7.895644877 podStartE2EDuration="43.030921213s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:22.153617866 +0000 UTC m=+970.855518049" lastFinishedPulling="2026-01-22 14:20:57.288894212 +0000 UTC m=+1005.990794385" observedRunningTime="2026-01-22 14:21:03.030763939 +0000 UTC m=+1011.732664122" watchObservedRunningTime="2026-01-22 14:21:03.030921213 +0000 UTC m=+1011.732821396"
Jan 22 14:21:03 crc kubenswrapper[4801]: I0122 14:21:03.054558 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx" podStartSLOduration=3.740113585 podStartE2EDuration="42.054532286s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.586105504 +0000 UTC m=+972.288005687" lastFinishedPulling="2026-01-22 14:21:01.900524205 +0000 UTC m=+1010.602424388" observedRunningTime="2026-01-22 14:21:03.047411653 +0000 UTC m=+1011.749311836" watchObservedRunningTime="2026-01-22 14:21:03.054532286 +0000 UTC m=+1011.756432469"
Jan 22 14:21:03 crc kubenswrapper[4801]: I0122 14:21:03.067031 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc" podStartSLOduration=4.229989323 podStartE2EDuration="43.067010431s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.063412315 +0000 UTC m=+971.765312498" lastFinishedPulling="2026-01-22 14:21:01.900433423 +0000 UTC m=+1010.602333606" observedRunningTime="2026-01-22 14:21:03.066369473 +0000 UTC m=+1011.768269686" watchObservedRunningTime="2026-01-22 14:21:03.067010431 +0000 UTC m=+1011.768910614"
Jan 22 14:21:03 crc kubenswrapper[4801]: I0122 14:21:03.148826 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx" podStartSLOduration=3.354428345 podStartE2EDuration="42.148809539s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.509004329 +0000 UTC m=+972.210904512" lastFinishedPulling="2026-01-22 14:21:02.303385523 +0000 UTC m=+1011.005285706" observedRunningTime="2026-01-22 14:21:03.105248299 +0000 UTC m=+1011.807148492" watchObservedRunningTime="2026-01-22 14:21:03.148809539 +0000 UTC m=+1011.850709722"
Jan 22 14:21:03 crc kubenswrapper[4801]: I0122 14:21:03.153532 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt" podStartSLOduration=42.153512993 podStartE2EDuration="42.153512993s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 14:21:03.136290193 +0000 UTC m=+1011.838190376" watchObservedRunningTime="2026-01-22 14:21:03.153512993 +0000 UTC m=+1011.855413176"
Jan 22 14:21:03 crc kubenswrapper[4801]: I0122 14:21:03.605521 4801 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 22 14:21:05 crc kubenswrapper[4801]: I0122 14:21:05.760951 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw" event={"ID":"5f1bd071-a488-4023-a2fd-7e8a8b0c998a","Type":"ContainerStarted","Data":"45443cb33d591502502f8ba7561cb6fbb57b1c70bb02471a42bc485b532c5e88"}
Jan 22 14:21:05 crc kubenswrapper[4801]: I0122 14:21:05.761689 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"
Jan 22 14:21:05 crc kubenswrapper[4801]: I0122 14:21:05.763100 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" event={"ID":"697b4f23-f1b2-4c36-b453-38bff993b462","Type":"ContainerStarted","Data":"c093ed484b35af0439e005a23c3fbfd1a96304407692281a9ffbe0447385ccb6"}
Jan 22 14:21:05 crc kubenswrapper[4801]: I0122 14:21:05.763377 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5"
Jan 22 14:21:05 crc kubenswrapper[4801]: I0122 14:21:05.803529 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw" podStartSLOduration=3.540875002 podStartE2EDuration="44.803508047s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.499433997 +0000 UTC m=+972.201334170" lastFinishedPulling="2026-01-22 14:21:04.762067042 +0000 UTC m=+1013.463967215" observedRunningTime="2026-01-22 14:21:05.792827823 +0000 UTC m=+1014.494728016" watchObservedRunningTime="2026-01-22 14:21:05.803508047 +0000 UTC m=+1014.505408230"
Jan 22 14:21:05 crc kubenswrapper[4801]: I0122 14:21:05.818599 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5" podStartSLOduration=3.5655696839999997 podStartE2EDuration="44.818575356s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.507911868 +0000 UTC m=+972.209812051" lastFinishedPulling="2026-01-22 14:21:04.76091753 +0000 UTC m=+1013.462817723" observedRunningTime="2026-01-22 14:21:05.81483935 +0000 UTC m=+1014.516739543" watchObservedRunningTime="2026-01-22 14:21:05.818575356 +0000 UTC m=+1014.520475539"
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.797434 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" event={"ID":"046a1aec-268f-4b7e-9644-572026853eaa","Type":"ContainerStarted","Data":"f922795ccd91d54fd6ffb325f46aad30c09f812bcdf18a9c7023fe3d5ffd726d"}
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.797706 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv"
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.799119 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" event={"ID":"552434b1-06fc-4f26-972f-e53d29f623e9","Type":"ContainerStarted","Data":"fb0607d676895be9b9ad85ccde48f5a079c0de4a15559d8f2e9cd65aea92667b"}
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.799238 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls"
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.800930 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" event={"ID":"da0d6842-c349-4dfb-8b0d-777cabdc8941","Type":"ContainerStarted","Data":"32c658555346956f2358dae79d2a4ede64e10bb48b690bf35ce038e74dee7422"}
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.801050 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk"
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.814921 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv" podStartSLOduration=43.188835588 podStartE2EDuration="48.814899069s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:21:02.481302658 +0000 UTC m=+1011.183202841" lastFinishedPulling="2026-01-22 14:21:08.107366139 +0000 UTC m=+1016.809266322" observedRunningTime="2026-01-22 14:21:08.811327178 +0000 UTC m=+1017.513227371" watchObservedRunningTime="2026-01-22 14:21:08.814899069 +0000 UTC m=+1017.516799252"
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.836051 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk" podStartSLOduration=43.125959268 podStartE2EDuration="48.836030801s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:21:02.396583056 +0000 UTC m=+1011.098483239" lastFinishedPulling="2026-01-22 14:21:08.106654549 +0000 UTC m=+1016.808554772" observedRunningTime="2026-01-22 14:21:08.830595746 +0000 UTC m=+1017.532495929" watchObservedRunningTime="2026-01-22 14:21:08.836030801 +0000 UTC m=+1017.537930984"
Jan 22 14:21:08 crc kubenswrapper[4801]: I0122 14:21:08.854798 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls" podStartSLOduration=3.222175119 podStartE2EDuration="47.854781664s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.477041689 +0000 UTC m=+972.178941872" lastFinishedPulling="2026-01-22 14:21:08.109648194 +0000 UTC m=+1016.811548417" observedRunningTime="2026-01-22 14:21:08.853606901 +0000 UTC m=+1017.555507094" watchObservedRunningTime="2026-01-22 14:21:08.854781664 +0000 UTC m=+1017.556681847"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.388939 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-z9z4g"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.389824 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59dd8b7cbf-mx4m5"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.390847 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-9lj8x"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.391522 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-f2wt7"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.392496 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-g8z77"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.393171 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-69d6c9f5b8-v4msq"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.475875 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-fphcj"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.507115 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-rlqgc"
Jan 22 14:21:11 crc kubenswrapper[4801]: I0122 14:21:11.677115 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5d8f59fb49-cthv6"
Jan 22 14:21:12 crc kubenswrapper[4801]: I0122 14:21:12.010854 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5d646b7d76-tw2h5"
Jan 22 14:21:12 crc kubenswrapper[4801]: I0122 14:21:12.040179 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-85cd9769bb-h6dmw"
Jan 22 14:21:12 crc kubenswrapper[4801]: I0122 14:21:12.054428 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-tvbhx"
Jan 22 14:21:12 crc kubenswrapper[4801]: I0122 14:21:12.175660 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5ffb9c6597-zmpxx"
Jan 22 14:21:12 crc kubenswrapper[4801]: E0122 14:21:12.587328 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:a8fc8f9d445b1232f446119015b226008b07c6a259f5bebc1fcbb39ec310afe5\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" podUID="dd6ed712-5542-4847-af9b-a536b29000b2"
Jan 22 14:21:12 crc kubenswrapper[4801]: E0122 14:21:12.587511 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" podUID="00d655ad-5646-42b8-baf8-f02ee28ab4ac"
Jan 22 14:21:12 crc kubenswrapper[4801]: I0122 14:21:12.830354 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" event={"ID":"c5890193-a2af-432d-b66b-62f480c03768","Type":"ContainerStarted","Data":"4d2830901a8a3d1a1f930326985a565a3a2f47ed6d02f4011e2af6938f8dc9d4"}
Jan 22 14:21:13 crc kubenswrapper[4801]: I0122 14:21:13.018760 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-ffdkv"
Jan 22 14:21:13 crc kubenswrapper[4801]: I0122 14:21:13.565979 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk"
Jan 22 14:21:13 crc kubenswrapper[4801]: E0122 14:21:13.572242 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" podUID="7dd65530-964d-4b59-a201-023f968c59f8"
Jan 22 14:21:13 crc kubenswrapper[4801]: I0122 14:21:13.837472 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" event={"ID":"f9e65da4-cf22-44e4-8e0e-f33aee5413ba","Type":"ContainerStarted","Data":"07bb2053fedc4540517a13b17773d55c31e35211d24bb8e9f7c928089e5db3f4"}
Jan 22 14:21:13 crc kubenswrapper[4801]: I0122 14:21:13.838396 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg"
Jan 22 14:21:13 crc kubenswrapper[4801]: I0122 14:21:13.855873 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg" podStartSLOduration=5.171409061 podStartE2EDuration="53.855855063s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.47249936 +0000 UTC m=+972.174399543" lastFinishedPulling="2026-01-22 14:21:12.156945362 +0000 UTC m=+1020.858845545" observedRunningTime="2026-01-22 14:21:13.850253284 +0000 UTC m=+1022.552153487" watchObservedRunningTime="2026-01-22 14:21:13.855855063 +0000 UTC m=+1022.557755256"
Jan 22 14:21:14 crc kubenswrapper[4801]: I0122 14:21:14.085221 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6dbd46c5ff-nxzjt"
Jan 22 14:21:14 crc kubenswrapper[4801]: I0122 14:21:14.844502 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" event={"ID":"bec5ba53-958e-48be-af46-cf073df0d161","Type":"ContainerStarted","Data":"01c001612654bef137549bc0b88edf48be4f637a8315fdc76e5a7ff678198481"}
Jan 22 14:21:14 crc kubenswrapper[4801]: I0122 14:21:14.844797 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm"
Jan 22 14:21:14 crc kubenswrapper[4801]: I0122 14:21:14.845165 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f"
Jan 22 14:21:14 crc kubenswrapper[4801]: I0122 14:21:14.891299 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f" podStartSLOduration=3.178225573 podStartE2EDuration="54.891282058s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:22.410286313 +0000 UTC m=+971.112186496" lastFinishedPulling="2026-01-22 14:21:14.123342798 +0000 UTC m=+1022.825242981" observedRunningTime="2026-01-22 14:21:14.889344693 +0000 UTC m=+1023.591244886" watchObservedRunningTime="2026-01-22 14:21:14.891282058 +0000 UTC m=+1023.593182241"
Jan 22 14:21:14 crc kubenswrapper[4801]: I0122 14:21:14.894825 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm" podStartSLOduration=3.9759007889999998 podStartE2EDuration="54.894815938s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:22.423492249 +0000 UTC m=+971.125392432" lastFinishedPulling="2026-01-22 14:21:13.342407378 +0000 UTC m=+1022.044307581" observedRunningTime="2026-01-22 14:21:14.876039894 +0000 UTC m=+1023.577940087" watchObservedRunningTime="2026-01-22 14:21:14.894815938 +0000 UTC m=+1023.596716121"
Jan 22 14:21:21 crc kubenswrapper[4801]: I0122 14:21:21.033863 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-69cf5d4557-jjdzm"
Jan 22 14:21:21 crc kubenswrapper[4801]: I0122 14:21:21.338798 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-dzf5f"
Jan 22 14:21:21 crc kubenswrapper[4801]: I0122 14:21:21.722436 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-6b8bc8d87d-ss9zg"
Jan 22 14:21:21 crc kubenswrapper[4801]: I0122 14:21:21.868260 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-xlsls"
Jan 22 14:21:31 crc kubenswrapper[4801]: I0122 14:21:31.992167 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" event={"ID":"7dd65530-964d-4b59-a201-023f968c59f8","Type":"ContainerStarted","Data":"c86487ce93a9f39c958f37322df735fe741aad56668c922427cd95e7b4b4a81e"}
Jan 22 14:21:31 crc kubenswrapper[4801]: I0122 14:21:31.992932 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"
Jan 22 14:21:31 crc kubenswrapper[4801]: I0122 14:21:31.995118 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" event={"ID":"dd6ed712-5542-4847-af9b-a536b29000b2","Type":"ContainerStarted","Data":"b726e1c1571c65af134a7c5e69581aca180e4c090386acd7ec3bd2dd41752ccf"}
Jan 22 14:21:31 crc kubenswrapper[4801]: I0122 14:21:31.995310 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf"
Jan 22 14:21:31 crc kubenswrapper[4801]: I0122 14:21:31.996726 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" event={"ID":"00d655ad-5646-42b8-baf8-f02ee28ab4ac","Type":"ContainerStarted","Data":"28b94c8b51190ace713ee0ae33120d44f84c7013b36dc49c418126f0227d6479"}
Jan 22 14:21:32 crc kubenswrapper[4801]: I0122 14:21:32.007522 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r" podStartSLOduration=3.359784582 podStartE2EDuration="1m11.007504523s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.57193291 +0000 UTC m=+972.273833093" lastFinishedPulling="2026-01-22 14:21:31.219652841 +0000 UTC m=+1039.921553034" observedRunningTime="2026-01-22 14:21:32.00527764 +0000 UTC m=+1040.707177833" watchObservedRunningTime="2026-01-22 14:21:32.007504523 +0000 UTC m=+1040.709404706"
Jan 22 14:21:32 crc kubenswrapper[4801]: I0122 14:21:32.028401 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf" podStartSLOduration=4.655774433 podStartE2EDuration="1m12.028379687s" podCreationTimestamp="2026-01-22 14:20:20 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.543416709 +0000 UTC m=+972.245316892" lastFinishedPulling="2026-01-22 14:21:30.916021963 +0000 UTC m=+1039.617922146" observedRunningTime="2026-01-22 14:21:32.022813728 +0000 UTC m=+1040.724713911" watchObservedRunningTime="2026-01-22 14:21:32.028379687 +0000 UTC m=+1040.730279870"
Jan 22 14:21:34 crc kubenswrapper[4801]: I0122 14:21:34.022227 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:21:34 crc kubenswrapper[4801]: I0122 14:21:34.022363 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:21:41 crc kubenswrapper[4801]: I0122 14:21:41.753042 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7bd9774b6-d4lmf"
Jan 22 14:21:41 crc kubenswrapper[4801]: I0122 14:21:41.771407 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-x7wxr" podStartSLOduration=13.116357296 podStartE2EDuration="1m20.771390915s" podCreationTimestamp="2026-01-22 14:20:21 +0000 UTC" firstStartedPulling="2026-01-22 14:20:23.563363006 +0000 UTC m=+972.265263189" lastFinishedPulling="2026-01-22 14:21:31.218396625 +0000 UTC m=+1039.920296808" observedRunningTime="2026-01-22 14:21:32.042097757 +0000 UTC m=+1040.743997940" watchObservedRunningTime="2026-01-22 14:21:41.771390915 +0000 UTC m=+1050.473291098"
Jan 22 14:21:41 crc kubenswrapper[4801]: I0122 14:21:41.969017 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-t8r9r"
Jan 22 14:22:04 crc kubenswrapper[4801]: I0122 14:22:04.021095 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:22:04 crc kubenswrapper[4801]: I0122 14:22:04.021725 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.020554 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.021083 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.021125 4801 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp"
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.021693 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"749e24075f70012784f8fbfbfdda757c7090725e8db804765c8486f57c8e62bc"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.021736 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://749e24075f70012784f8fbfbfdda757c7090725e8db804765c8486f57c8e62bc" gracePeriod=600
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.555922 4801 generic.go:334] "Generic (PLEG): container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="749e24075f70012784f8fbfbfdda757c7090725e8db804765c8486f57c8e62bc" exitCode=0
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.556067 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"749e24075f70012784f8fbfbfdda757c7090725e8db804765c8486f57c8e62bc"}
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.556355 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"fa22fa20eb83b2dc98db94496f3479520a8cddc6f4d033768d56df149e74807c"}
Jan 22 14:22:34 crc kubenswrapper[4801]: I0122 14:22:34.556386 4801 scope.go:117] "RemoveContainer" containerID="61d4928d4fe513fa3ba1966842778daa9adc983474924e63b516aff418f3aa6f"
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.602684 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x5bzd/must-gather-7r8rm"]
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.604499 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x5bzd/must-gather-7r8rm"
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.606539 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-x5bzd"/"default-dockercfg-r9tb8"
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.607404 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x5bzd"/"kube-root-ca.crt"
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.608224 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x5bzd"/"openshift-service-ca.crt"
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.621203 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x5bzd/must-gather-7r8rm"]
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.708043 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nvwg\" (UniqueName: \"kubernetes.io/projected/4d92917a-09f9-494b-a53b-e0d7b41040bd-kube-api-access-2nvwg\") pod \"must-gather-7r8rm\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") " pod="openshift-must-gather-x5bzd/must-gather-7r8rm"
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.708463 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d92917a-09f9-494b-a53b-e0d7b41040bd-must-gather-output\") pod \"must-gather-7r8rm\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") " pod="openshift-must-gather-x5bzd/must-gather-7r8rm"
Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.810242 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d92917a-09f9-494b-a53b-e0d7b41040bd-must-gather-output\") pod
\"must-gather-7r8rm\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") " pod="openshift-must-gather-x5bzd/must-gather-7r8rm" Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.810324 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nvwg\" (UniqueName: \"kubernetes.io/projected/4d92917a-09f9-494b-a53b-e0d7b41040bd-kube-api-access-2nvwg\") pod \"must-gather-7r8rm\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") " pod="openshift-must-gather-x5bzd/must-gather-7r8rm" Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.810878 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d92917a-09f9-494b-a53b-e0d7b41040bd-must-gather-output\") pod \"must-gather-7r8rm\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") " pod="openshift-must-gather-x5bzd/must-gather-7r8rm" Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.830741 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nvwg\" (UniqueName: \"kubernetes.io/projected/4d92917a-09f9-494b-a53b-e0d7b41040bd-kube-api-access-2nvwg\") pod \"must-gather-7r8rm\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") " pod="openshift-must-gather-x5bzd/must-gather-7r8rm" Jan 22 14:22:37 crc kubenswrapper[4801]: I0122 14:22:37.921887 4801 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x5bzd/must-gather-7r8rm" Jan 22 14:22:38 crc kubenswrapper[4801]: I0122 14:22:38.305944 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x5bzd/must-gather-7r8rm"] Jan 22 14:22:38 crc kubenswrapper[4801]: I0122 14:22:38.588734 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x5bzd/must-gather-7r8rm" event={"ID":"4d92917a-09f9-494b-a53b-e0d7b41040bd","Type":"ContainerStarted","Data":"d46c78edac2d60b7584e66655922d0e47c808d9cfd0ec734992a998df58e3b33"} Jan 22 14:22:49 crc kubenswrapper[4801]: I0122 14:22:49.670787 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x5bzd/must-gather-7r8rm" event={"ID":"4d92917a-09f9-494b-a53b-e0d7b41040bd","Type":"ContainerStarted","Data":"cfc62b96190e967ffc09f065730b7b7853d51bd6a78ed975066b6c6dd529b095"} Jan 22 14:22:49 crc kubenswrapper[4801]: I0122 14:22:49.671425 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x5bzd/must-gather-7r8rm" event={"ID":"4d92917a-09f9-494b-a53b-e0d7b41040bd","Type":"ContainerStarted","Data":"19b1e6c464e875fd0ec65c3fd7ff32603530afbe9b61e673f39b8ffd7556c159"} Jan 22 14:22:49 crc kubenswrapper[4801]: I0122 14:22:49.687649 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x5bzd/must-gather-7r8rm" podStartSLOduration=2.570878794 podStartE2EDuration="12.687630925s" podCreationTimestamp="2026-01-22 14:22:37 +0000 UTC" firstStartedPulling="2026-01-22 14:22:38.322129254 +0000 UTC m=+1107.024029437" lastFinishedPulling="2026-01-22 14:22:48.438881385 +0000 UTC m=+1117.140781568" observedRunningTime="2026-01-22 14:22:49.684393383 +0000 UTC m=+1118.386293576" watchObservedRunningTime="2026-01-22 14:22:49.687630925 +0000 UTC m=+1118.389531108" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.164347 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t_9c65ce77-9de5-4b7e-a5ee-668640034790/util/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.340766 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t_9c65ce77-9de5-4b7e-a5ee-668640034790/pull/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.342847 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t_9c65ce77-9de5-4b7e-a5ee-668640034790/pull/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.344366 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t_9c65ce77-9de5-4b7e-a5ee-668640034790/util/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.502822 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t_9c65ce77-9de5-4b7e-a5ee-668640034790/util/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.506308 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t_9c65ce77-9de5-4b7e-a5ee-668640034790/pull/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.546670 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7fc5189f3ed7e80c3354d9d7b6d4cbe8a166419e085aad87ace8a207afmcw6t_9c65ce77-9de5-4b7e-a5ee-668640034790/extract/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.709218 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59dd8b7cbf-mx4m5_62197d9f-468d-4a38-8b24-b4639ec0b1c0/manager/0.log" Jan 22 14:23:51 crc 
kubenswrapper[4801]: I0122 14:23:51.714056 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-69cf5d4557-jjdzm_f9e65da4-cf22-44e4-8e0e-f33aee5413ba/manager/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.913253 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78fdd796fd-9lj8x_31afca8d-2c0d-419c-ba85-de05d929490b/manager/0.log" Jan 22 14:23:51 crc kubenswrapper[4801]: I0122 14:23:51.918801 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b45d7bf98-z9z4g_3d1aad44-ff2f-4309-9984-7da6200820ac/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.085060 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-f2wt7_e05677d8-9dbb-489c-9eb8-84ff6981776c/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.098692 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-g8z77_e1799966-64ec-4e45-b7a5-45715376926c/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.262911 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54ccf4f85d-ffdkv_046a1aec-268f-4b7e-9644-572026853eaa/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.276757 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-69d6c9f5b8-v4msq_8bd2b2a1-52ee-41f6-ada0-652b74c4542b/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.443566 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b8b6d4659-dzf5f_bec5ba53-958e-48be-af46-cf073df0d161/manager/0.log" Jan 22 
14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.470353 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78c6999f6f-rlqgc_d64f87cc-2d65-4b7e-a8c4-4fcbbf57ad43/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.613955 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-fphcj_1fed9af5-123f-4db7-84d2-192d22c9024f/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.653011 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5d8f59fb49-cthv6_def6ced9-f64a-428e-ae30-6b1f97648c5f/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.802969 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6b8bc8d87d-ss9zg_c5890193-a2af-432d-b66b-62f480c03768/manager/0.log" Jan 22 14:23:52 crc kubenswrapper[4801]: I0122 14:23:52.860907 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7bd9774b6-d4lmf_dd6ed712-5542-4847-af9b-a536b29000b2/manager/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.000936 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854zqtbk_da0d6842-c349-4dfb-8b0d-777cabdc8941/manager/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.207088 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-9ff957fff-prp2t_0fa19eb8-dac2-48b6-b5f3-71ce817b8d33/operator/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.217230 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6dbd46c5ff-nxzjt_cb47af42-6498-44e4-8a55-04e5fff23296/manager/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.362274 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xrv9d_1f553167-d9c0-4808-b942-348a04597e38/registry-server/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.395814 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-xlsls_552434b1-06fc-4f26-972f-e53d29f623e9/manager/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.552235 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5d646b7d76-tw2h5_697b4f23-f1b2-4c36-b453-38bff993b462/manager/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.622623 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-x7wxr_00d655ad-5646-42b8-baf8-f02ee28ab4ac/operator/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.737938 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-t8r9r_7dd65530-964d-4b59-a201-023f968c59f8/manager/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.800643 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-85cd9769bb-h6dmw_5f1bd071-a488-4023-a2fd-7e8a8b0c998a/manager/0.log" Jan 22 14:23:53 crc kubenswrapper[4801]: I0122 14:23:53.940913 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-tvbhx_bd9524a3-7561-4e18-84f7-44bd95a5475c/manager/0.log" Jan 22 14:23:54 crc kubenswrapper[4801]: I0122 14:23:54.014179 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5ffb9c6597-zmpxx_e3932e04-5349-4181-8019-f651145fa996/manager/0.log" Jan 22 14:24:11 crc kubenswrapper[4801]: I0122 14:24:11.757085 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bpwfq_c319403b-3764-4420-8e53-06fb93e21a23/control-plane-machine-set-operator/0.log" Jan 22 14:24:11 crc kubenswrapper[4801]: I0122 14:24:11.926756 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tzbl4_322bcacd-4b58-46cf-b37e-2ffda3f87b24/kube-rbac-proxy/0.log" Jan 22 14:24:11 crc kubenswrapper[4801]: I0122 14:24:11.978762 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tzbl4_322bcacd-4b58-46cf-b37e-2ffda3f87b24/machine-api-operator/0.log" Jan 22 14:24:23 crc kubenswrapper[4801]: I0122 14:24:23.959359 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-9872z_2dbd57ed-5cef-4b6d-a11e-4afaf8b8b9de/cert-manager-controller/0.log" Jan 22 14:24:24 crc kubenswrapper[4801]: I0122 14:24:24.160021 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-4vmgn_405b6c3b-1654-4cf5-a56d-c7670af97153/cert-manager-cainjector/0.log" Jan 22 14:24:24 crc kubenswrapper[4801]: I0122 14:24:24.161878 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-5cn2c_4c56e565-c300-436f-8c6a-a8e8a366a124/cert-manager-webhook/0.log" Jan 22 14:24:34 crc kubenswrapper[4801]: I0122 14:24:34.021261 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 22 14:24:34 crc kubenswrapper[4801]: I0122 14:24:34.021917 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:24:36 crc kubenswrapper[4801]: I0122 14:24:36.362178 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-2qkpv_544915b2-6bab-49e2-a670-0bb95ca121e6/nmstate-console-plugin/0.log" Jan 22 14:24:36 crc kubenswrapper[4801]: I0122 14:24:36.539616 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pvj5p_f54897d6-d02f-4aca-8021-25f78df893d3/nmstate-handler/0.log" Jan 22 14:24:36 crc kubenswrapper[4801]: I0122 14:24:36.603127 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-qchm6_77f6b218-668c-4315-9ed5-be81d3c0ca42/kube-rbac-proxy/0.log" Jan 22 14:24:36 crc kubenswrapper[4801]: I0122 14:24:36.880831 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-qchm6_77f6b218-668c-4315-9ed5-be81d3c0ca42/nmstate-metrics/0.log" Jan 22 14:24:37 crc kubenswrapper[4801]: I0122 14:24:37.006366 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-pg4px_c43a5cdd-7ddd-4a2e-b015-02dc0bc8c630/nmstate-operator/0.log" Jan 22 14:24:37 crc kubenswrapper[4801]: I0122 14:24:37.083126 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-rwdj4_0697bb7e-f3c4-4945-ab46-21e0ae796a8d/nmstate-webhook/0.log" Jan 22 14:25:02 crc kubenswrapper[4801]: I0122 14:25:02.490833 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-6968d8fdc4-fln4v_4804b763-eb34-47b5-958c-dd672ac9a5be/kube-rbac-proxy/0.log" Jan 22 14:25:02 crc kubenswrapper[4801]: I0122 14:25:02.499950 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-fln4v_4804b763-eb34-47b5-958c-dd672ac9a5be/controller/0.log" Jan 22 14:25:02 crc kubenswrapper[4801]: I0122 14:25:02.648690 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-frr-files/0.log" Jan 22 14:25:02 crc kubenswrapper[4801]: I0122 14:25:02.842051 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-reloader/0.log" Jan 22 14:25:02 crc kubenswrapper[4801]: I0122 14:25:02.899236 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-metrics/0.log" Jan 22 14:25:02 crc kubenswrapper[4801]: I0122 14:25:02.906155 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-frr-files/0.log" Jan 22 14:25:02 crc kubenswrapper[4801]: I0122 14:25:02.906267 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-reloader/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.039841 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-frr-files/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.059329 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-metrics/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.071186 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-reloader/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.124938 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-metrics/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.291658 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-frr-files/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.314716 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-reloader/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.334154 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/controller/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.355836 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/cp-metrics/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.490911 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/kube-rbac-proxy/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.528531 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/kube-rbac-proxy-frr/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.532241 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/frr-metrics/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.626250 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/frr/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.719781 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dvff6_a92ad7b9-a7fd-49c2-b42e-132fa97b2228/reloader/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.729939 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-s2qlr_2789e45f-da51-4236-a35b-bce9acd65d2c/frr-k8s-webhook-server/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.878609 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-66cd969-xk8wl_02287e96-efd1-40d3-a38c-b3e5eed73386/manager/0.log" Jan 22 14:25:03 crc kubenswrapper[4801]: I0122 14:25:03.888241 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b5b556888-hxr6q_e19728a3-4d6b-40d4-bb06-4f715a4ac345/webhook-server/0.log" Jan 22 14:25:04 crc kubenswrapper[4801]: I0122 14:25:04.021583 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 14:25:04 crc kubenswrapper[4801]: I0122 14:25:04.021675 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 14:25:04 crc kubenswrapper[4801]: I0122 14:25:04.071379 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-46zlw_3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df/kube-rbac-proxy/0.log" Jan 22 14:25:04 crc kubenswrapper[4801]: I0122 14:25:04.194115 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-46zlw_3dcb9bca-f8c6-4a31-87cd-41f7b4b3a6df/speaker/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.063872 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc_6341e1c6-5dc9-4d69-bd26-904c36a57f1b/util/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.282303 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc_6341e1c6-5dc9-4d69-bd26-904c36a57f1b/util/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.313086 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc_6341e1c6-5dc9-4d69-bd26-904c36a57f1b/pull/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.370150 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc_6341e1c6-5dc9-4d69-bd26-904c36a57f1b/pull/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.600182 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc_6341e1c6-5dc9-4d69-bd26-904c36a57f1b/pull/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.601502 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc_6341e1c6-5dc9-4d69-bd26-904c36a57f1b/util/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.604304 4801 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a75bkc_6341e1c6-5dc9-4d69-bd26-904c36a57f1b/extract/0.log" Jan 22 14:25:17 crc kubenswrapper[4801]: I0122 14:25:17.881268 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj_1a18dfb8-3d0d-4014-97ff-a3d6d77736d4/util/0.log" Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.036151 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj_1a18dfb8-3d0d-4014-97ff-a3d6d77736d4/util/0.log" Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.047782 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj_1a18dfb8-3d0d-4014-97ff-a3d6d77736d4/pull/0.log" Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.138999 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj_1a18dfb8-3d0d-4014-97ff-a3d6d77736d4/pull/0.log" Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.310137 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj_1a18dfb8-3d0d-4014-97ff-a3d6d77736d4/util/0.log" Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.328332 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj_1a18dfb8-3d0d-4014-97ff-a3d6d77736d4/pull/0.log" Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.333433 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcd7zgj_1a18dfb8-3d0d-4014-97ff-a3d6d77736d4/extract/0.log" Jan 
Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.480124 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw_6c1ed5aa-48cb-4d1f-8691-6edc756db955/util/0.log"
Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.663970 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw_6c1ed5aa-48cb-4d1f-8691-6edc756db955/pull/0.log"
Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.691354 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw_6c1ed5aa-48cb-4d1f-8691-6edc756db955/pull/0.log"
Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.785669 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw_6c1ed5aa-48cb-4d1f-8691-6edc756db955/util/0.log"
Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.888928 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw_6c1ed5aa-48cb-4d1f-8691-6edc756db955/util/0.log"
Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.920998 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw_6c1ed5aa-48cb-4d1f-8691-6edc756db955/pull/0.log"
Jan 22 14:25:18 crc kubenswrapper[4801]: I0122 14:25:18.967134 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71325zhw_6c1ed5aa-48cb-4d1f-8691-6edc756db955/extract/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.093298 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gbz4b_50b8f0ee-b547-4b8b-9bac-9803fee56dec/extract-utilities/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.289848 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gbz4b_50b8f0ee-b547-4b8b-9bac-9803fee56dec/extract-content/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.332520 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gbz4b_50b8f0ee-b547-4b8b-9bac-9803fee56dec/extract-content/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.338365 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gbz4b_50b8f0ee-b547-4b8b-9bac-9803fee56dec/extract-utilities/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.484250 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gbz4b_50b8f0ee-b547-4b8b-9bac-9803fee56dec/extract-content/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.490423 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gbz4b_50b8f0ee-b547-4b8b-9bac-9803fee56dec/extract-utilities/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.765216 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gbz4b_50b8f0ee-b547-4b8b-9bac-9803fee56dec/registry-server/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.770760 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w898g_c97391e4-71b7-4569-b82e-a9cd32d9f439/extract-utilities/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.932490 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w898g_c97391e4-71b7-4569-b82e-a9cd32d9f439/extract-utilities/0.log"
Jan 22 14:25:19 crc kubenswrapper[4801]: I0122 14:25:19.940846 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w898g_c97391e4-71b7-4569-b82e-a9cd32d9f439/extract-content/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.112177 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w898g_c97391e4-71b7-4569-b82e-a9cd32d9f439/extract-content/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.230410 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w898g_c97391e4-71b7-4569-b82e-a9cd32d9f439/extract-utilities/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.277632 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w898g_c97391e4-71b7-4569-b82e-a9cd32d9f439/extract-content/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.473261 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k78hx_1d5c5ddb-bf6a-4b15-8171-7cf71089d411/marketplace-operator/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.497085 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w898g_c97391e4-71b7-4569-b82e-a9cd32d9f439/registry-server/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.575796 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnccj_1cbdd90e-f455-4394-9547-a9d49c8f7ffe/extract-utilities/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.727308 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnccj_1cbdd90e-f455-4394-9547-a9d49c8f7ffe/extract-content/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.763809 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnccj_1cbdd90e-f455-4394-9547-a9d49c8f7ffe/extract-utilities/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.773634 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnccj_1cbdd90e-f455-4394-9547-a9d49c8f7ffe/extract-content/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.964562 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnccj_1cbdd90e-f455-4394-9547-a9d49c8f7ffe/extract-utilities/0.log"
Jan 22 14:25:20 crc kubenswrapper[4801]: I0122 14:25:20.990927 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnccj_1cbdd90e-f455-4394-9547-a9d49c8f7ffe/extract-content/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.056002 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnccj_1cbdd90e-f455-4394-9547-a9d49c8f7ffe/registry-server/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.093425 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcrjs_caa6ed3a-95d8-4d88-9970-b2545b4c5803/extract-utilities/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.328864 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcrjs_caa6ed3a-95d8-4d88-9970-b2545b4c5803/extract-content/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.360669 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcrjs_caa6ed3a-95d8-4d88-9970-b2545b4c5803/extract-content/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.365368 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcrjs_caa6ed3a-95d8-4d88-9970-b2545b4c5803/extract-utilities/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.614010 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcrjs_caa6ed3a-95d8-4d88-9970-b2545b4c5803/extract-utilities/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.620918 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcrjs_caa6ed3a-95d8-4d88-9970-b2545b4c5803/extract-content/0.log"
Jan 22 14:25:21 crc kubenswrapper[4801]: I0122 14:25:21.804642 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcrjs_caa6ed3a-95d8-4d88-9970-b2545b4c5803/registry-server/0.log"
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.020603 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.021146 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.021198 4801 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp"
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.021798 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa22fa20eb83b2dc98db94496f3479520a8cddc6f4d033768d56df149e74807c"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.021918 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://fa22fa20eb83b2dc98db94496f3479520a8cddc6f4d033768d56df149e74807c" gracePeriod=600
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.394779 4801 generic.go:334] "Generic (PLEG): container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="fa22fa20eb83b2dc98db94496f3479520a8cddc6f4d033768d56df149e74807c" exitCode=0
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.394823 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"fa22fa20eb83b2dc98db94496f3479520a8cddc6f4d033768d56df149e74807c"}
Jan 22 14:25:34 crc kubenswrapper[4801]: I0122 14:25:34.394855 4801 scope.go:117] "RemoveContainer" containerID="749e24075f70012784f8fbfbfdda757c7090725e8db804765c8486f57c8e62bc"
Jan 22 14:25:35 crc kubenswrapper[4801]: I0122 14:25:35.429931 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerStarted","Data":"5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"}
Jan 22 14:26:33 crc kubenswrapper[4801]: I0122 14:26:33.904058 4801 generic.go:334] "Generic (PLEG): container finished" podID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerID="19b1e6c464e875fd0ec65c3fd7ff32603530afbe9b61e673f39b8ffd7556c159" exitCode=0
Jan 22 14:26:33 crc kubenswrapper[4801]: I0122 14:26:33.904150 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x5bzd/must-gather-7r8rm" event={"ID":"4d92917a-09f9-494b-a53b-e0d7b41040bd","Type":"ContainerDied","Data":"19b1e6c464e875fd0ec65c3fd7ff32603530afbe9b61e673f39b8ffd7556c159"}
Jan 22 14:26:33 crc kubenswrapper[4801]: I0122 14:26:33.906065 4801 scope.go:117] "RemoveContainer" containerID="19b1e6c464e875fd0ec65c3fd7ff32603530afbe9b61e673f39b8ffd7556c159"
Jan 22 14:26:34 crc kubenswrapper[4801]: I0122 14:26:34.777643 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x5bzd_must-gather-7r8rm_4d92917a-09f9-494b-a53b-e0d7b41040bd/gather/0.log"
Jan 22 14:26:41 crc kubenswrapper[4801]: I0122 14:26:41.776549 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x5bzd/must-gather-7r8rm"]
Jan 22 14:26:41 crc kubenswrapper[4801]: I0122 14:26:41.777343 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-x5bzd/must-gather-7r8rm" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerName="copy" containerID="cri-o://cfc62b96190e967ffc09f065730b7b7853d51bd6a78ed975066b6c6dd529b095" gracePeriod=2
Jan 22 14:26:41 crc kubenswrapper[4801]: I0122 14:26:41.784504 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x5bzd/must-gather-7r8rm"]
Jan 22 14:26:41 crc kubenswrapper[4801]: I0122 14:26:41.967537 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x5bzd_must-gather-7r8rm_4d92917a-09f9-494b-a53b-e0d7b41040bd/copy/0.log"
Jan 22 14:26:41 crc kubenswrapper[4801]: I0122 14:26:41.967889 4801 generic.go:334] "Generic (PLEG): container finished" podID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerID="cfc62b96190e967ffc09f065730b7b7853d51bd6a78ed975066b6c6dd529b095" exitCode=143
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.164974 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x5bzd_must-gather-7r8rm_4d92917a-09f9-494b-a53b-e0d7b41040bd/copy/0.log"
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.165613 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x5bzd/must-gather-7r8rm"
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.286821 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d92917a-09f9-494b-a53b-e0d7b41040bd-must-gather-output\") pod \"4d92917a-09f9-494b-a53b-e0d7b41040bd\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") "
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.287009 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nvwg\" (UniqueName: \"kubernetes.io/projected/4d92917a-09f9-494b-a53b-e0d7b41040bd-kube-api-access-2nvwg\") pod \"4d92917a-09f9-494b-a53b-e0d7b41040bd\" (UID: \"4d92917a-09f9-494b-a53b-e0d7b41040bd\") "
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.294681 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d92917a-09f9-494b-a53b-e0d7b41040bd-kube-api-access-2nvwg" (OuterVolumeSpecName: "kube-api-access-2nvwg") pod "4d92917a-09f9-494b-a53b-e0d7b41040bd" (UID: "4d92917a-09f9-494b-a53b-e0d7b41040bd"). InnerVolumeSpecName "kube-api-access-2nvwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.371834 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d92917a-09f9-494b-a53b-e0d7b41040bd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4d92917a-09f9-494b-a53b-e0d7b41040bd" (UID: "4d92917a-09f9-494b-a53b-e0d7b41040bd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.389368 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nvwg\" (UniqueName: \"kubernetes.io/projected/4d92917a-09f9-494b-a53b-e0d7b41040bd-kube-api-access-2nvwg\") on node \"crc\" DevicePath \"\""
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.389408 4801 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d92917a-09f9-494b-a53b-e0d7b41040bd-must-gather-output\") on node \"crc\" DevicePath \"\""
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.975678 4801 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x5bzd_must-gather-7r8rm_4d92917a-09f9-494b-a53b-e0d7b41040bd/copy/0.log"
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.976175 4801 scope.go:117] "RemoveContainer" containerID="cfc62b96190e967ffc09f065730b7b7853d51bd6a78ed975066b6c6dd529b095"
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.976275 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x5bzd/must-gather-7r8rm"
Jan 22 14:26:42 crc kubenswrapper[4801]: I0122 14:26:42.998649 4801 scope.go:117] "RemoveContainer" containerID="19b1e6c464e875fd0ec65c3fd7ff32603530afbe9b61e673f39b8ffd7556c159"
Jan 22 14:26:43 crc kubenswrapper[4801]: I0122 14:26:43.581510 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" path="/var/lib/kubelet/pods/4d92917a-09f9-494b-a53b-e0d7b41040bd/volumes"
Jan 22 14:27:34 crc kubenswrapper[4801]: I0122 14:27:34.021406 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:27:34 crc kubenswrapper[4801]: I0122 14:27:34.021952 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:28:04 crc kubenswrapper[4801]: I0122 14:28:04.021607 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:28:04 crc kubenswrapper[4801]: I0122 14:28:04.022677 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.797615 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpn28"]
Jan 22 14:28:19 crc kubenswrapper[4801]: E0122 14:28:19.798468 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerName="copy"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.798480 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerName="copy"
Jan 22 14:28:19 crc kubenswrapper[4801]: E0122 14:28:19.798494 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerName="gather"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.798500 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerName="gather"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.798641 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerName="copy"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.798652 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d92917a-09f9-494b-a53b-e0d7b41040bd" containerName="gather"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.799575 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.854238 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpn28"]
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.862127 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-utilities\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.862244 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pkm\" (UniqueName: \"kubernetes.io/projected/ceb091dd-ae92-4116-aa92-c4443cf011b3-kube-api-access-88pkm\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.862406 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-catalog-content\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.963680 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-catalog-content\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.963815 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-utilities\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.963852 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pkm\" (UniqueName: \"kubernetes.io/projected/ceb091dd-ae92-4116-aa92-c4443cf011b3-kube-api-access-88pkm\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.964323 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-catalog-content\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:19 crc kubenswrapper[4801]: I0122 14:28:19.964342 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-utilities\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:20 crc kubenswrapper[4801]: I0122 14:28:20.000756 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pkm\" (UniqueName: \"kubernetes.io/projected/ceb091dd-ae92-4116-aa92-c4443cf011b3-kube-api-access-88pkm\") pod \"redhat-operators-kpn28\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") " pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:20 crc kubenswrapper[4801]: I0122 14:28:20.116193 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:20 crc kubenswrapper[4801]: I0122 14:28:20.386782 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpn28"]
Jan 22 14:28:21 crc kubenswrapper[4801]: I0122 14:28:21.041994 4801 generic.go:334] "Generic (PLEG): container finished" podID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerID="083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e" exitCode=0
Jan 22 14:28:21 crc kubenswrapper[4801]: I0122 14:28:21.042103 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpn28" event={"ID":"ceb091dd-ae92-4116-aa92-c4443cf011b3","Type":"ContainerDied","Data":"083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e"}
Jan 22 14:28:21 crc kubenswrapper[4801]: I0122 14:28:21.042277 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpn28" event={"ID":"ceb091dd-ae92-4116-aa92-c4443cf011b3","Type":"ContainerStarted","Data":"3f1058c37911b487e36564f83d3d7673c62b77ebc77e6bcb9b0bf783af18efcf"}
Jan 22 14:28:21 crc kubenswrapper[4801]: I0122 14:28:21.043305 4801 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 22 14:28:23 crc kubenswrapper[4801]: I0122 14:28:23.060082 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpn28" event={"ID":"ceb091dd-ae92-4116-aa92-c4443cf011b3","Type":"ContainerStarted","Data":"019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83"}
Jan 22 14:28:24 crc kubenswrapper[4801]: I0122 14:28:24.071714 4801 generic.go:334] "Generic (PLEG): container finished" podID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerID="019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83" exitCode=0
Jan 22 14:28:24 crc kubenswrapper[4801]: I0122 14:28:24.071827 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpn28" event={"ID":"ceb091dd-ae92-4116-aa92-c4443cf011b3","Type":"ContainerDied","Data":"019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83"}
Jan 22 14:28:26 crc kubenswrapper[4801]: I0122 14:28:26.086850 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpn28" event={"ID":"ceb091dd-ae92-4116-aa92-c4443cf011b3","Type":"ContainerStarted","Data":"b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110"}
Jan 22 14:28:26 crc kubenswrapper[4801]: I0122 14:28:26.106607 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpn28" podStartSLOduration=3.442017969 podStartE2EDuration="7.106588875s" podCreationTimestamp="2026-01-22 14:28:19 +0000 UTC" firstStartedPulling="2026-01-22 14:28:21.043116791 +0000 UTC m=+1449.745016974" lastFinishedPulling="2026-01-22 14:28:24.707687677 +0000 UTC m=+1453.409587880" observedRunningTime="2026-01-22 14:28:26.1039569 +0000 UTC m=+1454.805857073" watchObservedRunningTime="2026-01-22 14:28:26.106588875 +0000 UTC m=+1454.808489058"
Jan 22 14:28:30 crc kubenswrapper[4801]: I0122 14:28:30.116478 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:30 crc kubenswrapper[4801]: I0122 14:28:30.117580 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:31 crc kubenswrapper[4801]: I0122 14:28:31.172294 4801 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kpn28" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="registry-server" probeResult="failure" output=<
Jan 22 14:28:31 crc kubenswrapper[4801]: timeout: failed to connect service ":50051" within 1s
Jan 22 14:28:31 crc kubenswrapper[4801]: >
Jan 22 14:28:34 crc kubenswrapper[4801]: I0122 14:28:34.021109 4801 patch_prober.go:28] interesting pod/machine-config-daemon-5t2tp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 14:28:34 crc kubenswrapper[4801]: I0122 14:28:34.021595 4801 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 14:28:34 crc kubenswrapper[4801]: I0122 14:28:34.021643 4801 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp"
Jan 22 14:28:34 crc kubenswrapper[4801]: I0122 14:28:34.022286 4801 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"} pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 22 14:28:34 crc kubenswrapper[4801]: I0122 14:28:34.022349 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" containerName="machine-config-daemon" containerID="cri-o://5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df" gracePeriod=600
Jan 22 14:28:36 crc kubenswrapper[4801]: E0122 14:28:36.836472 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da"
Jan 22 14:28:37 crc kubenswrapper[4801]: I0122 14:28:37.174148 4801 generic.go:334] "Generic (PLEG): container finished" podID="2b842046-5300-4281-9d73-3ae42f0d56da" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df" exitCode=0
Jan 22 14:28:37 crc kubenswrapper[4801]: I0122 14:28:37.174193 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" event={"ID":"2b842046-5300-4281-9d73-3ae42f0d56da","Type":"ContainerDied","Data":"5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"}
Jan 22 14:28:37 crc kubenswrapper[4801]: I0122 14:28:37.174225 4801 scope.go:117] "RemoveContainer" containerID="fa22fa20eb83b2dc98db94496f3479520a8cddc6f4d033768d56df149e74807c"
Jan 22 14:28:37 crc kubenswrapper[4801]: I0122 14:28:37.174836 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"
Jan 22 14:28:37 crc kubenswrapper[4801]: E0122 14:28:37.175228 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da"
Jan 22 14:28:40 crc kubenswrapper[4801]: I0122 14:28:40.159589 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:40 crc kubenswrapper[4801]: I0122 14:28:40.203530 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:40 crc kubenswrapper[4801]: I0122 14:28:40.399683 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpn28"]
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.210938 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kpn28" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="registry-server" containerID="cri-o://b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110" gracePeriod=2
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.617415 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.669905 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-catalog-content\") pod \"ceb091dd-ae92-4116-aa92-c4443cf011b3\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") "
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.670060 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88pkm\" (UniqueName: \"kubernetes.io/projected/ceb091dd-ae92-4116-aa92-c4443cf011b3-kube-api-access-88pkm\") pod \"ceb091dd-ae92-4116-aa92-c4443cf011b3\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") "
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.670111 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-utilities\") pod \"ceb091dd-ae92-4116-aa92-c4443cf011b3\" (UID: \"ceb091dd-ae92-4116-aa92-c4443cf011b3\") "
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.671845 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-utilities" (OuterVolumeSpecName: "utilities") pod "ceb091dd-ae92-4116-aa92-c4443cf011b3" (UID: "ceb091dd-ae92-4116-aa92-c4443cf011b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.676641 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb091dd-ae92-4116-aa92-c4443cf011b3-kube-api-access-88pkm" (OuterVolumeSpecName: "kube-api-access-88pkm") pod "ceb091dd-ae92-4116-aa92-c4443cf011b3" (UID: "ceb091dd-ae92-4116-aa92-c4443cf011b3"). InnerVolumeSpecName "kube-api-access-88pkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.772233 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.772271 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88pkm\" (UniqueName: \"kubernetes.io/projected/ceb091dd-ae92-4116-aa92-c4443cf011b3-kube-api-access-88pkm\") on node \"crc\" DevicePath \"\""
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.793081 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceb091dd-ae92-4116-aa92-c4443cf011b3" (UID: "ceb091dd-ae92-4116-aa92-c4443cf011b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:28:41 crc kubenswrapper[4801]: I0122 14:28:41.873723 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb091dd-ae92-4116-aa92-c4443cf011b3-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.220241 4801 generic.go:334] "Generic (PLEG): container finished" podID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerID="b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110" exitCode=0
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.220300 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpn28" event={"ID":"ceb091dd-ae92-4116-aa92-c4443cf011b3","Type":"ContainerDied","Data":"b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110"}
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.220349 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpn28" event={"ID":"ceb091dd-ae92-4116-aa92-c4443cf011b3","Type":"ContainerDied","Data":"3f1058c37911b487e36564f83d3d7673c62b77ebc77e6bcb9b0bf783af18efcf"}
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.220354 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpn28"
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.220400 4801 scope.go:117] "RemoveContainer" containerID="b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110"
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.242519 4801 scope.go:117] "RemoveContainer" containerID="019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83"
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.263603 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpn28"]
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.270052 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kpn28"]
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.288780 4801 scope.go:117] "RemoveContainer" containerID="083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e"
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.308143 4801 scope.go:117] "RemoveContainer" containerID="b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110"
Jan 22 14:28:42 crc kubenswrapper[4801]: E0122 14:28:42.308523 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110\": container with ID starting with b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110 not found: ID does not exist" containerID="b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110"
Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.308555 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110"} err="failed to get container status \"b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110\": rpc error: code = NotFound desc = could not find container 
\"b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110\": container with ID starting with b178200155943a5fd767018d83fc486c67b18a2a2bfd4a635cdb33af4f448110 not found: ID does not exist" Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.308577 4801 scope.go:117] "RemoveContainer" containerID="019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83" Jan 22 14:28:42 crc kubenswrapper[4801]: E0122 14:28:42.309020 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83\": container with ID starting with 019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83 not found: ID does not exist" containerID="019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83" Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.309045 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83"} err="failed to get container status \"019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83\": rpc error: code = NotFound desc = could not find container \"019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83\": container with ID starting with 019ab09cc9057e85c703bf901ff51f73068bcca35314fb27bcd2b3a514ab3c83 not found: ID does not exist" Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.309060 4801 scope.go:117] "RemoveContainer" containerID="083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e" Jan 22 14:28:42 crc kubenswrapper[4801]: E0122 14:28:42.309345 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e\": container with ID starting with 083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e not found: ID does not exist" 
containerID="083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e" Jan 22 14:28:42 crc kubenswrapper[4801]: I0122 14:28:42.309365 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e"} err="failed to get container status \"083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e\": rpc error: code = NotFound desc = could not find container \"083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e\": container with ID starting with 083df2189cc794eac3029afe7cfb1efa95ceb861d088e9aa8a8be3bef2867e2e not found: ID does not exist" Jan 22 14:28:43 crc kubenswrapper[4801]: I0122 14:28:43.580140 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" path="/var/lib/kubelet/pods/ceb091dd-ae92-4116-aa92-c4443cf011b3/volumes" Jan 22 14:28:48 crc kubenswrapper[4801]: I0122 14:28:48.571510 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df" Jan 22 14:28:48 crc kubenswrapper[4801]: E0122 14:28:48.572160 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" Jan 22 14:29:02 crc kubenswrapper[4801]: I0122 14:29:02.571601 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df" Jan 22 14:29:02 crc kubenswrapper[4801]: E0122 14:29:02.573049 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" Jan 22 14:29:14 crc kubenswrapper[4801]: I0122 14:29:14.571702 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df" Jan 22 14:29:14 crc kubenswrapper[4801]: E0122 14:29:14.572418 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" Jan 22 14:29:20 crc kubenswrapper[4801]: I0122 14:29:20.995659 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2656"] Jan 22 14:29:20 crc kubenswrapper[4801]: E0122 14:29:20.997401 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="extract-utilities" Jan 22 14:29:20 crc kubenswrapper[4801]: I0122 14:29:20.997511 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="extract-utilities" Jan 22 14:29:20 crc kubenswrapper[4801]: E0122 14:29:20.997588 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="registry-server" Jan 22 14:29:20 crc kubenswrapper[4801]: I0122 14:29:20.997648 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="registry-server" Jan 22 14:29:20 crc kubenswrapper[4801]: E0122 14:29:20.997704 4801 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="extract-content" Jan 22 14:29:20 crc kubenswrapper[4801]: I0122 14:29:20.997758 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="extract-content" Jan 22 14:29:20 crc kubenswrapper[4801]: I0122 14:29:20.997948 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb091dd-ae92-4116-aa92-c4443cf011b3" containerName="registry-server" Jan 22 14:29:20 crc kubenswrapper[4801]: I0122 14:29:20.998982 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.011971 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2656"] Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.147852 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-utilities\") pod \"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.148101 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pqc5\" (UniqueName: \"kubernetes.io/projected/d3064b52-aa97-40f9-b8a5-5493a9e4b911-kube-api-access-6pqc5\") pod \"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.148162 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-catalog-content\") pod 
\"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.249628 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pqc5\" (UniqueName: \"kubernetes.io/projected/d3064b52-aa97-40f9-b8a5-5493a9e4b911-kube-api-access-6pqc5\") pod \"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.249909 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-catalog-content\") pod \"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.250095 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-utilities\") pod \"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.250534 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-catalog-content\") pod \"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.250567 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-utilities\") pod \"certified-operators-s2656\" (UID: 
\"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.269488 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pqc5\" (UniqueName: \"kubernetes.io/projected/d3064b52-aa97-40f9-b8a5-5493a9e4b911-kube-api-access-6pqc5\") pod \"certified-operators-s2656\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.328478 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:21 crc kubenswrapper[4801]: I0122 14:29:21.860282 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2656"] Jan 22 14:29:22 crc kubenswrapper[4801]: I0122 14:29:22.529058 4801 generic.go:334] "Generic (PLEG): container finished" podID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerID="04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3" exitCode=0 Jan 22 14:29:22 crc kubenswrapper[4801]: I0122 14:29:22.529098 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2656" event={"ID":"d3064b52-aa97-40f9-b8a5-5493a9e4b911","Type":"ContainerDied","Data":"04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3"} Jan 22 14:29:22 crc kubenswrapper[4801]: I0122 14:29:22.529120 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2656" event={"ID":"d3064b52-aa97-40f9-b8a5-5493a9e4b911","Type":"ContainerStarted","Data":"3141d7e04d87395dd7c57df960e91ad462d6e4759d72c80941533774deaa18a9"} Jan 22 14:29:24 crc kubenswrapper[4801]: I0122 14:29:24.542480 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2656" 
event={"ID":"d3064b52-aa97-40f9-b8a5-5493a9e4b911","Type":"ContainerStarted","Data":"8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f"} Jan 22 14:29:25 crc kubenswrapper[4801]: I0122 14:29:25.551323 4801 generic.go:334] "Generic (PLEG): container finished" podID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerID="8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f" exitCode=0 Jan 22 14:29:25 crc kubenswrapper[4801]: I0122 14:29:25.551375 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2656" event={"ID":"d3064b52-aa97-40f9-b8a5-5493a9e4b911","Type":"ContainerDied","Data":"8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f"} Jan 22 14:29:26 crc kubenswrapper[4801]: I0122 14:29:26.559346 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2656" event={"ID":"d3064b52-aa97-40f9-b8a5-5493a9e4b911","Type":"ContainerStarted","Data":"b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48"} Jan 22 14:29:26 crc kubenswrapper[4801]: I0122 14:29:26.586419 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2656" podStartSLOduration=4.083979385 podStartE2EDuration="6.586399463s" podCreationTimestamp="2026-01-22 14:29:20 +0000 UTC" firstStartedPulling="2026-01-22 14:29:23.538412097 +0000 UTC m=+1512.240312300" lastFinishedPulling="2026-01-22 14:29:26.040832195 +0000 UTC m=+1514.742732378" observedRunningTime="2026-01-22 14:29:26.578709275 +0000 UTC m=+1515.280609468" watchObservedRunningTime="2026-01-22 14:29:26.586399463 +0000 UTC m=+1515.288299656" Jan 22 14:29:27 crc kubenswrapper[4801]: I0122 14:29:27.572201 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df" Jan 22 14:29:27 crc kubenswrapper[4801]: E0122 14:29:27.572497 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da" Jan 22 14:29:31 crc kubenswrapper[4801]: I0122 14:29:31.328724 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:31 crc kubenswrapper[4801]: I0122 14:29:31.330107 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:31 crc kubenswrapper[4801]: I0122 14:29:31.371270 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:31 crc kubenswrapper[4801]: I0122 14:29:31.630596 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:31 crc kubenswrapper[4801]: I0122 14:29:31.687651 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2656"] Jan 22 14:29:33 crc kubenswrapper[4801]: I0122 14:29:33.601283 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2656" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="registry-server" containerID="cri-o://b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48" gracePeriod=2 Jan 22 14:29:33 crc kubenswrapper[4801]: E0122 14:29:33.655264 4801 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3064b52_aa97_40f9_b8a5_5493a9e4b911.slice/crio-b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48.scope\": RecentStats: unable to find data in memory cache]" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.507554 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.620361 4801 generic.go:334] "Generic (PLEG): container finished" podID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerID="b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48" exitCode=0 Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.620420 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2656" event={"ID":"d3064b52-aa97-40f9-b8a5-5493a9e4b911","Type":"ContainerDied","Data":"b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48"} Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.620551 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2656" event={"ID":"d3064b52-aa97-40f9-b8a5-5493a9e4b911","Type":"ContainerDied","Data":"3141d7e04d87395dd7c57df960e91ad462d6e4759d72c80941533774deaa18a9"} Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.620585 4801 scope.go:117] "RemoveContainer" containerID="b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.620675 4801 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2656" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.640110 4801 scope.go:117] "RemoveContainer" containerID="8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.657131 4801 scope.go:117] "RemoveContainer" containerID="04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.671380 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pqc5\" (UniqueName: \"kubernetes.io/projected/d3064b52-aa97-40f9-b8a5-5493a9e4b911-kube-api-access-6pqc5\") pod \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.671558 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-utilities\") pod \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.671663 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-catalog-content\") pod \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\" (UID: \"d3064b52-aa97-40f9-b8a5-5493a9e4b911\") " Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.672781 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-utilities" (OuterVolumeSpecName: "utilities") pod "d3064b52-aa97-40f9-b8a5-5493a9e4b911" (UID: "d3064b52-aa97-40f9-b8a5-5493a9e4b911"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.678525 4801 scope.go:117] "RemoveContainer" containerID="b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.679430 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3064b52-aa97-40f9-b8a5-5493a9e4b911-kube-api-access-6pqc5" (OuterVolumeSpecName: "kube-api-access-6pqc5") pod "d3064b52-aa97-40f9-b8a5-5493a9e4b911" (UID: "d3064b52-aa97-40f9-b8a5-5493a9e4b911"). InnerVolumeSpecName "kube-api-access-6pqc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 14:29:35 crc kubenswrapper[4801]: E0122 14:29:35.682018 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48\": container with ID starting with b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48 not found: ID does not exist" containerID="b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.682057 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48"} err="failed to get container status \"b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48\": rpc error: code = NotFound desc = could not find container \"b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48\": container with ID starting with b84e970e68fce0e57434c4fe5140d7ad5effe0f38a1c1afb4d924cfa7911bb48 not found: ID does not exist" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.682083 4801 scope.go:117] "RemoveContainer" containerID="8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f" Jan 22 14:29:35 crc kubenswrapper[4801]: E0122 14:29:35.687731 
4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f\": container with ID starting with 8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f not found: ID does not exist" containerID="8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.687779 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f"} err="failed to get container status \"8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f\": rpc error: code = NotFound desc = could not find container \"8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f\": container with ID starting with 8512b92bdce7e52d165a8a272d40ab00053154cb70258e6cac89c934d4e8509f not found: ID does not exist" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.687805 4801 scope.go:117] "RemoveContainer" containerID="04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3" Jan 22 14:29:35 crc kubenswrapper[4801]: E0122 14:29:35.688399 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3\": container with ID starting with 04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3 not found: ID does not exist" containerID="04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.688479 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3"} err="failed to get container status \"04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3\": rpc error: code = 
NotFound desc = could not find container \"04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3\": container with ID starting with 04bce298f6042f67e6dd6db902e3f12ded5a0da318c59532d2e30d3754c646e3 not found: ID does not exist" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.731078 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3064b52-aa97-40f9-b8a5-5493a9e4b911" (UID: "d3064b52-aa97-40f9-b8a5-5493a9e4b911"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.773715 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.773797 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3064b52-aa97-40f9-b8a5-5493a9e4b911-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.773826 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pqc5\" (UniqueName: \"kubernetes.io/projected/d3064b52-aa97-40f9-b8a5-5493a9e4b911-kube-api-access-6pqc5\") on node \"crc\" DevicePath \"\"" Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.990678 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2656"] Jan 22 14:29:35 crc kubenswrapper[4801]: I0122 14:29:35.997056 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s2656"] Jan 22 14:29:37 crc kubenswrapper[4801]: I0122 14:29:37.587940 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" path="/var/lib/kubelet/pods/d3064b52-aa97-40f9-b8a5-5493a9e4b911/volumes"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.431542 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6g2z7"]
Jan 22 14:29:40 crc kubenswrapper[4801]: E0122 14:29:40.432578 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="extract-content"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.432599 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="extract-content"
Jan 22 14:29:40 crc kubenswrapper[4801]: E0122 14:29:40.432618 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="registry-server"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.432626 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="registry-server"
Jan 22 14:29:40 crc kubenswrapper[4801]: E0122 14:29:40.432636 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="extract-utilities"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.432651 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="extract-utilities"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.432858 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3064b52-aa97-40f9-b8a5-5493a9e4b911" containerName="registry-server"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.434513 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.445865 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6g2z7"]
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.539486 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7kg\" (UniqueName: \"kubernetes.io/projected/4bc041f3-eb5b-44d8-887f-baacb41d6570-kube-api-access-hn7kg\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.539562 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-utilities\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.539581 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-catalog-content\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.641284 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7kg\" (UniqueName: \"kubernetes.io/projected/4bc041f3-eb5b-44d8-887f-baacb41d6570-kube-api-access-hn7kg\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.641781 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-utilities\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.641833 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-catalog-content\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.642568 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-catalog-content\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.642677 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-utilities\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.665218 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7kg\" (UniqueName: \"kubernetes.io/projected/4bc041f3-eb5b-44d8-887f-baacb41d6570-kube-api-access-hn7kg\") pod \"redhat-marketplace-6g2z7\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") " pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:40 crc kubenswrapper[4801]: I0122 14:29:40.767921 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:41 crc kubenswrapper[4801]: I0122 14:29:41.212124 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6g2z7"]
Jan 22 14:29:41 crc kubenswrapper[4801]: I0122 14:29:41.581260 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"
Jan 22 14:29:41 crc kubenswrapper[4801]: E0122 14:29:41.581506 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da"
Jan 22 14:29:41 crc kubenswrapper[4801]: I0122 14:29:41.679514 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6g2z7" event={"ID":"4bc041f3-eb5b-44d8-887f-baacb41d6570","Type":"ContainerStarted","Data":"155a50af9ef48a8d148fcd2b3c3b1fa81cd2289bb7ee5663d4973d63a29a437f"}
Jan 22 14:29:42 crc kubenswrapper[4801]: I0122 14:29:42.687737 4801 generic.go:334] "Generic (PLEG): container finished" podID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerID="deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b" exitCode=0
Jan 22 14:29:42 crc kubenswrapper[4801]: I0122 14:29:42.687799 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6g2z7" event={"ID":"4bc041f3-eb5b-44d8-887f-baacb41d6570","Type":"ContainerDied","Data":"deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b"}
Jan 22 14:29:44 crc kubenswrapper[4801]: I0122 14:29:44.731087 4801 generic.go:334] "Generic (PLEG): container finished" podID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerID="8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13" exitCode=0
Jan 22 14:29:44 crc kubenswrapper[4801]: I0122 14:29:44.731200 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6g2z7" event={"ID":"4bc041f3-eb5b-44d8-887f-baacb41d6570","Type":"ContainerDied","Data":"8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13"}
Jan 22 14:29:46 crc kubenswrapper[4801]: I0122 14:29:46.757114 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6g2z7" event={"ID":"4bc041f3-eb5b-44d8-887f-baacb41d6570","Type":"ContainerStarted","Data":"afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01"}
Jan 22 14:29:46 crc kubenswrapper[4801]: I0122 14:29:46.780571 4801 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6g2z7" podStartSLOduration=3.739520898 podStartE2EDuration="6.780550347s" podCreationTimestamp="2026-01-22 14:29:40 +0000 UTC" firstStartedPulling="2026-01-22 14:29:42.692020217 +0000 UTC m=+1531.393920400" lastFinishedPulling="2026-01-22 14:29:45.733049656 +0000 UTC m=+1534.434949849" observedRunningTime="2026-01-22 14:29:46.778590501 +0000 UTC m=+1535.480490694" watchObservedRunningTime="2026-01-22 14:29:46.780550347 +0000 UTC m=+1535.482450540"
Jan 22 14:29:50 crc kubenswrapper[4801]: I0122 14:29:50.768820 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:50 crc kubenswrapper[4801]: I0122 14:29:50.769207 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:50 crc kubenswrapper[4801]: I0122 14:29:50.814373 4801 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:50 crc kubenswrapper[4801]: I0122 14:29:50.859938 4801 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:51 crc kubenswrapper[4801]: I0122 14:29:51.067434 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6g2z7"]
Jan 22 14:29:52 crc kubenswrapper[4801]: I0122 14:29:52.800282 4801 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6g2z7" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="registry-server" containerID="cri-o://afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01" gracePeriod=2
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.263279 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.270062 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-utilities\") pod \"4bc041f3-eb5b-44d8-887f-baacb41d6570\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") "
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.270140 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn7kg\" (UniqueName: \"kubernetes.io/projected/4bc041f3-eb5b-44d8-887f-baacb41d6570-kube-api-access-hn7kg\") pod \"4bc041f3-eb5b-44d8-887f-baacb41d6570\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") "
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.270197 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-catalog-content\") pod \"4bc041f3-eb5b-44d8-887f-baacb41d6570\" (UID: \"4bc041f3-eb5b-44d8-887f-baacb41d6570\") "
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.275337 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-utilities" (OuterVolumeSpecName: "utilities") pod "4bc041f3-eb5b-44d8-887f-baacb41d6570" (UID: "4bc041f3-eb5b-44d8-887f-baacb41d6570"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.281436 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc041f3-eb5b-44d8-887f-baacb41d6570-kube-api-access-hn7kg" (OuterVolumeSpecName: "kube-api-access-hn7kg") pod "4bc041f3-eb5b-44d8-887f-baacb41d6570" (UID: "4bc041f3-eb5b-44d8-887f-baacb41d6570"). InnerVolumeSpecName "kube-api-access-hn7kg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.322193 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bc041f3-eb5b-44d8-887f-baacb41d6570" (UID: "4bc041f3-eb5b-44d8-887f-baacb41d6570"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.371456 4801 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.371756 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn7kg\" (UniqueName: \"kubernetes.io/projected/4bc041f3-eb5b-44d8-887f-baacb41d6570-kube-api-access-hn7kg\") on node \"crc\" DevicePath \"\""
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.371847 4801 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bc041f3-eb5b-44d8-887f-baacb41d6570-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.808837 4801 generic.go:334] "Generic (PLEG): container finished" podID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerID="afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01" exitCode=0
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.808888 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6g2z7" event={"ID":"4bc041f3-eb5b-44d8-887f-baacb41d6570","Type":"ContainerDied","Data":"afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01"}
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.808921 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6g2z7" event={"ID":"4bc041f3-eb5b-44d8-887f-baacb41d6570","Type":"ContainerDied","Data":"155a50af9ef48a8d148fcd2b3c3b1fa81cd2289bb7ee5663d4973d63a29a437f"}
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.808940 4801 scope.go:117] "RemoveContainer" containerID="afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.809050 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6g2z7"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.832632 4801 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6g2z7"]
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.835532 4801 scope.go:117] "RemoveContainer" containerID="8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.838979 4801 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6g2z7"]
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.863700 4801 scope.go:117] "RemoveContainer" containerID="deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.887319 4801 scope.go:117] "RemoveContainer" containerID="afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01"
Jan 22 14:29:53 crc kubenswrapper[4801]: E0122 14:29:53.887779 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01\": container with ID starting with afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01 not found: ID does not exist" containerID="afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.887811 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01"} err="failed to get container status \"afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01\": rpc error: code = NotFound desc = could not find container \"afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01\": container with ID starting with afeddeb6b66a1a371b74d6a037b636e4bb90317cc8d4c399ce76647e42e4ff01 not found: ID does not exist"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.887834 4801 scope.go:117] "RemoveContainer" containerID="8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13"
Jan 22 14:29:53 crc kubenswrapper[4801]: E0122 14:29:53.888185 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13\": container with ID starting with 8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13 not found: ID does not exist" containerID="8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.888208 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13"} err="failed to get container status \"8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13\": rpc error: code = NotFound desc = could not find container \"8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13\": container with ID starting with 8d2eb76b1f82321188241a9b33cdc9c36b101e12db36e7730e0a8a523392be13 not found: ID does not exist"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.888220 4801 scope.go:117] "RemoveContainer" containerID="deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b"
Jan 22 14:29:53 crc kubenswrapper[4801]: E0122 14:29:53.888482 4801 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b\": container with ID starting with deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b not found: ID does not exist" containerID="deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b"
Jan 22 14:29:53 crc kubenswrapper[4801]: I0122 14:29:53.888506 4801 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b"} err="failed to get container status \"deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b\": rpc error: code = NotFound desc = could not find container \"deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b\": container with ID starting with deaac5d95540721e854be8c2eaf77204aa7c35b25525af0894d05f3f57c8984b not found: ID does not exist"
Jan 22 14:29:55 crc kubenswrapper[4801]: I0122 14:29:55.571094 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"
Jan 22 14:29:55 crc kubenswrapper[4801]: E0122 14:29:55.573117 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da"
Jan 22 14:29:55 crc kubenswrapper[4801]: I0122 14:29:55.582144 4801 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" path="/var/lib/kubelet/pods/4bc041f3-eb5b-44d8-887f-baacb41d6570/volumes"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.164704 4801 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"]
Jan 22 14:30:00 crc kubenswrapper[4801]: E0122 14:30:00.165044 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="registry-server"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.165057 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="registry-server"
Jan 22 14:30:00 crc kubenswrapper[4801]: E0122 14:30:00.165075 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="extract-content"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.165081 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="extract-content"
Jan 22 14:30:00 crc kubenswrapper[4801]: E0122 14:30:00.165094 4801 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="extract-utilities"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.165102 4801 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="extract-utilities"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.165239 4801 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc041f3-eb5b-44d8-887f-baacb41d6570" containerName="registry-server"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.165693 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.167359 4801 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.168912 4801 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.172737 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"]
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.227634 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c44886e-7d48-47b0-9c9d-60c929ef700b-secret-volume\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.227701 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dnjd\" (UniqueName: \"kubernetes.io/projected/8c44886e-7d48-47b0-9c9d-60c929ef700b-kube-api-access-9dnjd\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.227757 4801 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c44886e-7d48-47b0-9c9d-60c929ef700b-config-volume\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.330077 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c44886e-7d48-47b0-9c9d-60c929ef700b-config-volume\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.330394 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c44886e-7d48-47b0-9c9d-60c929ef700b-secret-volume\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.330468 4801 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dnjd\" (UniqueName: \"kubernetes.io/projected/8c44886e-7d48-47b0-9c9d-60c929ef700b-kube-api-access-9dnjd\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.332227 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c44886e-7d48-47b0-9c9d-60c929ef700b-config-volume\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.336978 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c44886e-7d48-47b0-9c9d-60c929ef700b-secret-volume\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.350863 4801 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dnjd\" (UniqueName: \"kubernetes.io/projected/8c44886e-7d48-47b0-9c9d-60c929ef700b-kube-api-access-9dnjd\") pod \"collect-profiles-29484870-mdmfq\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.545061 4801 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:00 crc kubenswrapper[4801]: I0122 14:30:00.980545 4801 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"]
Jan 22 14:30:01 crc kubenswrapper[4801]: I0122 14:30:01.888421 4801 generic.go:334] "Generic (PLEG): container finished" podID="8c44886e-7d48-47b0-9c9d-60c929ef700b" containerID="2a7dbda61ce356fa1ee6c1e8237bf87cbed22b618bddf70b901ac521d8aaa89b" exitCode=0
Jan 22 14:30:01 crc kubenswrapper[4801]: I0122 14:30:01.888560 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq" event={"ID":"8c44886e-7d48-47b0-9c9d-60c929ef700b","Type":"ContainerDied","Data":"2a7dbda61ce356fa1ee6c1e8237bf87cbed22b618bddf70b901ac521d8aaa89b"}
Jan 22 14:30:01 crc kubenswrapper[4801]: I0122 14:30:01.888640 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq" event={"ID":"8c44886e-7d48-47b0-9c9d-60c929ef700b","Type":"ContainerStarted","Data":"610243ec76edb68897584fb265d78988bc372a26bf5195a54cbb554012615ce3"}
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.166082 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.283869 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c44886e-7d48-47b0-9c9d-60c929ef700b-secret-volume\") pod \"8c44886e-7d48-47b0-9c9d-60c929ef700b\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") "
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.285032 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c44886e-7d48-47b0-9c9d-60c929ef700b-config-volume\") pod \"8c44886e-7d48-47b0-9c9d-60c929ef700b\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") "
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.285081 4801 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dnjd\" (UniqueName: \"kubernetes.io/projected/8c44886e-7d48-47b0-9c9d-60c929ef700b-kube-api-access-9dnjd\") pod \"8c44886e-7d48-47b0-9c9d-60c929ef700b\" (UID: \"8c44886e-7d48-47b0-9c9d-60c929ef700b\") "
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.285769 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c44886e-7d48-47b0-9c9d-60c929ef700b-config-volume" (OuterVolumeSpecName: "config-volume") pod "8c44886e-7d48-47b0-9c9d-60c929ef700b" (UID: "8c44886e-7d48-47b0-9c9d-60c929ef700b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.289755 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c44886e-7d48-47b0-9c9d-60c929ef700b-kube-api-access-9dnjd" (OuterVolumeSpecName: "kube-api-access-9dnjd") pod "8c44886e-7d48-47b0-9c9d-60c929ef700b" (UID: "8c44886e-7d48-47b0-9c9d-60c929ef700b"). InnerVolumeSpecName "kube-api-access-9dnjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.289884 4801 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c44886e-7d48-47b0-9c9d-60c929ef700b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8c44886e-7d48-47b0-9c9d-60c929ef700b" (UID: "8c44886e-7d48-47b0-9c9d-60c929ef700b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.385916 4801 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c44886e-7d48-47b0-9c9d-60c929ef700b-config-volume\") on node \"crc\" DevicePath \"\""
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.385950 4801 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dnjd\" (UniqueName: \"kubernetes.io/projected/8c44886e-7d48-47b0-9c9d-60c929ef700b-kube-api-access-9dnjd\") on node \"crc\" DevicePath \"\""
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.385965 4801 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c44886e-7d48-47b0-9c9d-60c929ef700b-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.913189 4801 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq" event={"ID":"8c44886e-7d48-47b0-9c9d-60c929ef700b","Type":"ContainerDied","Data":"610243ec76edb68897584fb265d78988bc372a26bf5195a54cbb554012615ce3"}
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.913248 4801 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="610243ec76edb68897584fb265d78988bc372a26bf5195a54cbb554012615ce3"
Jan 22 14:30:03 crc kubenswrapper[4801]: I0122 14:30:03.913340 4801 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484870-mdmfq"
Jan 22 14:30:06 crc kubenswrapper[4801]: I0122 14:30:06.571763 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"
Jan 22 14:30:06 crc kubenswrapper[4801]: E0122 14:30:06.572638 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da"
Jan 22 14:30:18 crc kubenswrapper[4801]: I0122 14:30:18.571779 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"
Jan 22 14:30:18 crc kubenswrapper[4801]: E0122 14:30:18.572981 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da"
Jan 22 14:30:32 crc kubenswrapper[4801]: I0122 14:30:32.572259 4801 scope.go:117] "RemoveContainer" containerID="5819fcb545e1eb1fa4546b8328b955b6a91266f22427d0fa736405fcc92269df"
Jan 22 14:30:32 crc kubenswrapper[4801]: E0122 14:30:32.575205 4801 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5t2tp_openshift-machine-config-operator(2b842046-5300-4281-9d73-3ae42f0d56da)\"" pod="openshift-machine-config-operator/machine-config-daemon-5t2tp" podUID="2b842046-5300-4281-9d73-3ae42f0d56da"